Hello Felipe,
thanks for reaching out to us.
I am glad you guys have implemented such a workflow and that it's working for you. I know it's not optimal in terms of user-friendliness.
Unfortunately, it's not easy to implement an easier workflow without more tools that wrap GeoGig. Also, from the database point of view, one of GeoGig's premises is to be unobtrusive. For example, we don't want to force changes to your PostGIS schema, or add triggers, table fields, etc., just to support the tool used to track your data's history/lineage.
That doesn't mean it can't be done. We do add triggers and extra tables to the geopackages created with the "gpkg export -i" command, which allows us to import back into the repository only what has changed once the field work is done. We don't do the same for PostGIS databases because those are generally more sensitive, and DBAs don't like an external tool touching their databases. But potentially, it can be done.
Which actually makes me wonder: would you like to try that approach? The field data collectors would use a geopackage with GeoGig's interchange format enabled to do the field work, and synchronize back to the repository once back at the office.
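Roughly, the round trip would look something like the sketch below. The layer and file names are just placeholders, and the import-back subcommand is from memory, so please verify everything against the built-in help of your GeoGig version before relying on it:

    # export the "roads" layer to a geopackage, with the interchange
    # format (audit triggers and tables) enabled via -i
    geogig gpkg export -i roads roads.gpkg

    # ... hand roads.gpkg to the field crew for data collection ...

    # back at the office, import only the audited changes into the
    # repository (exact subcommand/arguments may differ in your version)
    geogig gpkg pull roads.gpkg

The point is that the -i export adds the bookkeeping to the geopackage itself, so the file being edited in the field carries its own change log and the repository only needs to ingest the delta, not re-read the whole layer.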
Either way, your workflow seems just fine (albeit somewhat cumbersome until we manage to provide better tools than the command line).
Question: have you tried GeoGig 1.2.0 already? We've made extensive changes to improve the performance of local-to-local (i.e., non-HTTP) remote interaction, which should speed up fetch/clone/push/pull considerably in your case. We'd love to hear back from someone like you using it in real life, especially if you find a problem/defect with it.
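For reference, a local remote is simply a filesystem path instead of an http:// URL; something like the following (the paths and branch name are placeholders):

    # clone from a repository on the same machine or a shared filesystem
    geogig clone /data/repos/central fieldcopy

    # subsequent syncs go over the same local transport
    geogig fetch origin
    geogig pull origin master

Those are exactly the operations that 1.2.0 should make noticeably faster.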
Hope that helps. Best regards, and feel free to keep in touch through this list.
Gabriel.