Hi,
I am using GeoGig 1.0 on Windows 7 with Oracle JRE 1.8.0_121. I have a remote repository served via GeoServer/PostGIS, containing approximately 1 GB of data for Antarctica.
From the documentation, I understand that I should be able to clone the remote repository into a directory on the local machine. I first tried to do this by issuing the command:
geogig.bat clone http://<servername>/geoserver/geogig/repos/master master
in the intended directory. I get the following output:
Cloning into 'master'...
Fetching objects from refs/heads/master
10:55:42.835 [main] ERROR o.locationtech.geogig.cli.GeogigCLI - An unhandled error occurred: org.rocksdb.RocksDBException:
▒. See the log for more details.
java.lang.RuntimeException: org.rocksdb.RocksDBException:
▒
at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-18.0.jar:na]
at org.locationtech.geogig.rocksdb.RocksdbObjectStore.putAll(RocksdbObjectStore.java:382) ~[geogig-rocksdb-1.0.jar:1.0]
at org.locationtech.geogig.storage.impl.ForwardingObjectDatabase.putAll(ForwardingObjectDatabase.java:162) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.di.ObjectDatabasePutInterceptor$GraphUpdatingObjectDatabase.putAll(ObjectDatabasePutInterceptor.java:98) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.storage.impl.ForwardingObjectDatabase.putAll(ForwardingObjectDatabase.java:162) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.remote.BinaryPackedObjects.ingest(BinaryPackedObjects.java:224) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.remote.HttpRemoteRepo.fetchMoreData(HttpRemoteRepo.java:456) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.remote.HttpRemoteRepo.fetchNewData(HttpRemoteRepo.java:208) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.porcelain.FetchOp._call(FetchOp.java:257) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.porcelain.FetchOp._call(FetchOp.java:48) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.repository.AbstractGeoGigOp.call(AbstractGeoGigOp.java:153) ~[geogig-api-1.0.jar:1.0]
at org.locationtech.geogig.porcelain.CloneOp._call(CloneOp.java:163) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.porcelain.CloneOp._call(CloneOp.java:39) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.repository.AbstractGeoGigOp.call(AbstractGeoGigOp.java:153) ~[geogig-api-1.0.jar:1.0]
at org.locationtech.geogig.cli.porcelain.Clone.runInternal(Clone.java:164) ~[geogig-cli-1.0.jar:1.0]
at org.locationtech.geogig.cli.AbstractCommand.run(AbstractCommand.java:68) ~[geogig-cli-1.0.jar:1.0]
at org.locationtech.geogig.cli.GeogigCLI.executeInternal(GeogigCLI.java:528) ~[geogig-cli-1.0.jar:1.0]
at org.locationtech.geogig.cli.GeogigCLI.execute(GeogigCLI.java:367) ~[geogig-cli-1.0.jar:1.0]
at org.locationtech.geogig.cli.app.CLI.main(CLI.java:80) [geogig-cli-app-1.0.jar:1.0]
Caused by: org.rocksdb.RocksDBException:
▒
at org.rocksdb.RocksDB.write0(Native Method) ~[rocksdbjni-4.13.4.jar:na]
at org.rocksdb.RocksDB.write(RocksDB.java:602) ~[rocksdbjni-4.13.4.jar:na]
at org.locationtech.geogig.rocksdb.RocksdbObjectStore.putAll(RocksdbObjectStore.java:371) ~[geogig-rocksdb-1.0.jar:1.0]
... 17 common frames omitted
An unhandled error occurred: org.rocksdb.RocksDBException:
▒. See the log for more details.
The error happens consistently after fetching around 300,000 objects (out of roughly 1,130,000). The log file gives no information beyond what is pasted above.
I have tried:
geogig.bat init
in the directory as well, in case an initialised empty repository was required first, but I get the same response. With identical software versions on a CentOS 7 Linux VirtualBox VM running on the same Windows machine, I get:
geogig clone http://<servername>/geoserver/geogig/repos/master master
Cloning into 'master'...
Fetching objects from refs/heads/master
1,129,655
Processed 1,129,697 objects. Inserted: 1,129,697. Existing: 0. Time: 21.46 min. Compressed size: 630,289,302 bytes. Uncompressed size: 1,257,346,509 bytes.
100%
Done.
The resulting local repository is as expected. Have I hit some size or complexity limit in the Windows case? From the stack trace, I appear to be using RocksDB 4.13.4.
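
To help work out whether this is GeoGig itself or the bundled RocksDB JNI library misbehaving on Windows, I have sketched a small standalone test of the same WriteBatch/write code path that appears in the stack trace. The class name, batch count and value sizes are my own guesses rather than anything GeoGig actually does; it only assumes rocksdbjni 4.13.4 is on the classpath. I can run it on the same Windows box and report back if that would be useful:

import org.rocksdb.Options;
import org.rocksdb.RocksDB;
import org.rocksdb.RocksDBException;
import org.rocksdb.WriteBatch;
import org.rocksdb.WriteOptions;

import java.util.Random;

// Hypothetical standalone check of the RocksDB write path used by GeoGig's
// RocksdbObjectStore.putAll (WriteBatch -> RocksDB.write). Batch and value
// sizes are arbitrary, chosen only to push past ~300,000 records.
public class RocksdbWriteCheck {

    public static void main(String[] args) throws RocksDBException {
        RocksDB.loadLibrary();
        Random rnd = new Random(0);

        Options opts = new Options().setCreateIfMissing(true);
        // Creates a throwaway database directory under the working directory.
        RocksDB db = RocksDB.open(opts, "rocksdb-write-check");
        try {
            for (int batch = 0; batch < 400; batch++) {
                WriteBatch wb = new WriteBatch();
                for (int i = 0; i < 1000; i++) {
                    byte[] key = ("key-" + batch + "-" + i).getBytes();
                    byte[] value = new byte[1024];
                    rnd.nextBytes(value);
                    wb.put(key, value);
                }
                db.write(new WriteOptions(), wb);
            }
            System.out.println("All batches written without error");
        } finally {
            db.close();
        }
    }
}

If that fails with a similar RocksDBException at around the same volume of data, it would point at the native library rather than GeoGig.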
Any help with this would be much appreciated.
Thanks,
David Herbert
British Antarctic Survey.