
Re: [geogig-dev] GeoGig upload endpoint



On Mon, Apr 25, 2016 at 5:36 AM, Victor Olaya <volaya@xxxxxxxxxxxxxxxx> wrote:
Hey guys

I am trying to adapt the QGIS GeoGig plugin to the latest changes, and I have some questions.

1) It seems that now, for a given URL, there might be multiple repos being served. This is great, and in the plugin you now specify the URL and it populates the list with all available repos. All the per-repo endpoints seem to be operative and working fine. The only thing that I haven't been able to figure out is the tasks endpoint for asynchronous download. That was working before, but not now. Can anyone tell me how to use it now?
Maybe you're calling /repos/<repo>/tasks? It should be just /tasks.
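For illustration, a minimal Python sketch of hitting the server-level tasks endpoint (the host/port is a placeholder, and the per-task path is an assumption, not confirmed here):

    import requests

    base_url = "http://localhost:8182"   # assumed host/port; server root, not a repo URL

    # List tasks at the server-level /tasks endpoint (not /repos/<repo>/tasks)
    resp = requests.get(base_url + "/tasks")
    resp.raise_for_status()
    print(resp.text)

    # Checking a single task by id would presumably be /tasks/<task_id> (assumption)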
 

2) I am adding the "add layer" (upload a new layer to a repo) and "sync layer" (upload an existing one) functionality to the plugin. I have two questions about this:
    - How does GeoGig deal with a geopackage that contains multiple tables? A layer in QGIS corresponds to a table in the geopackage, but if there are more tables in that same geopackage, I don't see a way to upload just the one that contains the layer data.
    - How should triggers and additional tables be added when uploading a new layer? I guess I have to add them manually to the geopackage once the layer has been uploaded the first time. Another option is to upload it and then download a new geopackage with the extra stuff, but that sounds like a bad idea to me. Any thoughts on this?

Hmm, good questions. Let me write up the exact spec and hand it over to you.
 

Thanks!


On Mon, Apr 18, 2016 at 3:56 PM, Gabriel Roldan <groldan@xxxxxxxxxxxxxxxx> wrote:
Hey Victor, all. Cc'ing the devel list; let's try to send tech discussions over there as much as possible, even though most of them are currently held on Slack/Gerrit.

JD took over GIG-155.

The way it works is by calling POST <repo endpoint>/import?format=gpkg&... and sending the file as a POST form field called "fileUpload". Check the "Import GeoPackage" feature report here: http://dev.boundlessgeo.com/~groldan/cucumber-report/

When importing a geopackage with the geogig extension and populated audit tables, the only difference is that you'll have to add the "interchange=true" argument to the URL.
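As a rough sketch of that call from Python (the host, port and repo name are placeholders; the "fileUpload" field name and the format/interchange parameters are the ones described above):

    import requests

    repo_url = "http://localhost:8182/repos/myrepo"   # placeholder host/port and repo name

    # Upload a geopackage whose geogig audit tables should be interpreted (interchange format)
    with open("layers.gpkg", "rb") as f:
        resp = requests.post(
            repo_url + "/import",
            params={"format": "gpkg", "interchange": "true"},
            files={"fileUpload": f},   # sent as a multipart form field named "fileUpload"
        )
    resp.raise_for_status()
    print(resp.text)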

That said, JD's working out the details, and I anticipate we'll have to:
a- extend the op to indicate which branch to import onto
b- better define the op's return value (report), to make sure the client can put the local geopackage in a state that's in sync with the resulting commit on the server.

But we're making progress; let's sync up when you're back from vacation.



On Mon, Apr 18, 2016 at 5:30 AM, Victor Olaya <volaya@xxxxxxxxxxxxxxxx> wrote:
Erik, Gabriel

I didn't have time to look at this, but I want to resume it once I am back from vacation.

I see that there is no option for now to use the audit tables. Do you think it is better to wait until that is solved, or should I adapt the mechanism to work with this limitation and upload the full table?

Please, let me know the current state, so I can have all the info to continue working on this.

Thanks!

On Tue, Apr 5, 2016 at 5:54 PM, Erik Merkle <emerkle@xxxxxxxxxxxxxxxx> wrote:
Hey Victor,

Yes, Gabriel merged the import functionality over the past weekend. It's working, but it has a couple of limitations (unless Gabriel has fixed them without my knowing):

1) When you import, it will add the geopackage data to the current working branch (there isn't a way to specify which branch you want to import into).
2) Importing geopackage data doesn't support reading from the audit tables in the geopackage file being imported; it just adds the data it finds in the imported geopackage.

To use it, you must POST a request to the service. The most basic way is something like this (for XML responses):

POST <repository URL>/import?format=gpkg

or if you want JSON responses:

POST <repository URL>/import.json?format=gpkg

and the generic API is this:

POST <repository URL>/import[.xml|.json]?format=<format name>[&layer=<layer table name>][&dest=<destination path>][&alter=<true|false>][&forceFeatureType=<true|false>][&fidAttribute=<attribute name>][&authorEmail=<email address>][&authorName=<author name>][&message=<commit message>]

You MUST include the file to import in the request body, in a field named "fileUpload", encoded as multipart form content (see the Postman screenshot below).



You can add the "authorName", "authorEmail" and "message" request parameters too if you want to set them for the commit.
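To make that concrete, here is a small Python sketch of the generic call with a JSON response and the optional commit metadata (the host, repo name, layer name, file name and author details are all placeholder values):

    import requests

    repo_url = "http://localhost:8182/repos/myrepo"   # placeholder host/port and repo name

    params = {
        "format": "gpkg",
        "layer": "roads",                        # geopackage table to import (placeholder name)
        "authorName": "Victor Olaya",
        "authorEmail": "volaya@example.com",     # placeholder address
        "message": "Import roads layer from QGIS",
    }

    # The .json suffix asks for a JSON response; the file goes in a multipart field named "fileUpload"
    with open("roads.gpkg", "rb") as f:
        resp = requests.post(repo_url + "/import.json", params=params, files={"fileUpload": f})
    resp.raise_for_status()
    print(resp.json())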

Let me know if you need more.


Erik Merkle
Software Engineer | Boundless

On Tue, Apr 5, 2016 at 10:37 AM, Victor Olaya <volaya@xxxxxxxxxxxxxxxx> wrote:
Hey Erik

I have seen that there is already an endpoint to upload/import a layer
to a geogig repo. Is that working fine already?

I am back working on the GeoGig QGIS plugin, and that's functionality
I have to implement. Could you give me an example of how to use that
endpoint (a curl example would be fine) and whatever extra info I need
to be able to use it from QGIS?

Thanks!

--
Victor Olaya
Software Engineer | Boundless
volaya@xxxxxxxxxxxxxxxx
@boundlessgeo




--
Victor Olaya






--

Gabriel Roldán
Software Developer | Boundless
groldan@xxxxxxxxxxxxxxxx
@boundlessgeo





--
Victor Olaya






--

Gabriel Roldán
Software Developer | Boundless
groldan@xxxxxxxxxxxxxxxx
@boundlessgeo


