Hi Erik,
Thanks for your input. I replaced the rocksdb jar with version 4.11.2 and repeated the clone; I now get:
Fetching objects from refs/heads/master
17:12:32.171 [main] ERROR o.locationtech.geogig.cli.GeogigCLI - An unhandled error occurred: org.rocksdb.RocksDBException: IO error: Failed to FlushViewOfFile: C:\Users\DARB1.BS-DARB1-L5\Documents\projects\add\data\working\master\.geogig\objects.rocksdb/000015.sst: The process cannot access the file because another process has locked a portion of the file.
. See the log for more details.
java.lang.RuntimeException: org.rocksdb.RocksDBException: IO error: Failed to FlushViewOfFile: C:\Users\DARB1.BS-DARB1-L5\Documents\projects\add\data\working\master\.geogig\objects.rocksdb/000015.sst: The process cannot access the file because another process has locked a portion of the file.
at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-18.0.jar:na]
at org.locationtech.geogig.rocksdb.RocksdbObjectStore.putAll(RocksdbObjectStore.java:382) ~[geogig-rocksdb-1.0.jar:1.0]
at org.locationtech.geogig.storage.impl.ForwardingObjectDatabase.putAll(ForwardingObjectDatabase.java:162) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.di.ObjectDatabasePutInterceptor$GraphUpdatingObjectDatabase.putAll(ObjectDatabasePutInterceptor.java:98) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.storage.impl.ForwardingObjectDatabase.putAll(ForwardingObjectDatabase.java:162) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.remote.BinaryPackedObjects.ingest(BinaryPackedObjects.java:224) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.remote.HttpRemoteRepo.fetchMoreData(HttpRemoteRepo.java:456) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.remote.HttpRemoteRepo.fetchNewData(HttpRemoteRepo.java:208) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.porcelain.FetchOp._call(FetchOp.java:257) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.porcelain.FetchOp._call(FetchOp.java:48) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.repository.AbstractGeoGigOp.call(AbstractGeoGigOp.java:153) ~[geogig-api-1.0.jar:1.0]
at org.locationtech.geogig.porcelain.CloneOp._call(CloneOp.java:163) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.porcelain.CloneOp._call(CloneOp.java:39) ~[geogig-core-1.0.jar:1.0]
at org.locationtech.geogig.repository.AbstractGeoGigOp.call(AbstractGeoGigOp.java:153) ~[geogig-api-1.0.jar:1.0]
at org.locationtech.geogig.cli.porcelain.Clone.runInternal(Clone.java:164) ~[geogig-cli-1.0.jar:1.0]
at org.locationtech.geogig.cli.AbstractCommand.run(AbstractCommand.java:68) ~[geogig-cli-1.0.jar:1.0]
at org.locationtech.geogig.cli.GeogigCLI.executeInternal(GeogigCLI.java:528) ~[geogig-cli-1.0.jar:1.0]
at org.locationtech.geogig.cli.GeogigCLI.execute(GeogigCLI.java:367) ~[geogig-cli-1.0.jar:1.0]
at org.locationtech.geogig.cli.app.CLI.main(CLI.java:80) [geogig-cli-app-1.0.jar:1.0]
Caused by: org.rocksdb.RocksDBException: IO error: Failed to FlushViewOfFile: C:\Users\DARB1.BS-DARB1-L5\Documents\projects\add\data\working\master\.geogig\objects.rocksdb/000015.sst: The process cannot access the file because another process has locked a portion of the file.
at org.rocksdb.RocksDB.write0(Native Method) ~[rocksdbjni-4.11.2.jar:na]
at org.rocksdb.RocksDB.write(RocksDB.java:555) ~[rocksdbjni-4.11.2.jar:na]
at org.locationtech.geogig.rocksdb.RocksdbObjectStore.putAll(RocksdbObjectStore.java:371) ~[geogig-rocksdb-1.0.jar:1.0]
... 17 common frames omitted
An unhandled error occurred: org.rocksdb.RocksDBException: IO error: Failed to FlushViewOfFile: C:\Users\DARB1.BS-DARB1-L5\Documents\projects\add\data\working\master\.geogig\objects.rocksdb/000015.sst: The process cannot access the file because another process has locked a portion of the file.
. See the log for more details.
The log contains nothing more than this, apart from three occurrences of the following line before the stack trace:
2017-05-03 17:17:07,712 INFO [main] o.l.g.p.CreateDeduplicator [CreateDeduplicator.java:40] No DeduplicationService service found, using default heap based one
I guess this new error reveals it's some kind of file-locking issue.
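
For what it's worth, FlushViewOfFile is the Win32 call that flushes a memory-mapped view to disk, so whatever holds the lock (a virus scanner or the Windows search indexer, perhaps) seems to be colliding with a mapped .sst file. I put together a rough standalone test to see whether RocksDB writes hit the same lock outside of GeoGig; this is only my sketch against the rocksdbjni 4.11.2 API (the MmapWriteTest class and the C:\tmp\mmap-test path are made up, and I don't know what options GeoGig sets internally):

    import java.nio.charset.StandardCharsets;

    import org.rocksdb.Options;
    import org.rocksdb.RocksDB;
    import org.rocksdb.RocksDBException;

    public class MmapWriteTest {
        public static void main(String[] args) throws RocksDBException {
            RocksDB.loadLibrary();
            // setAllowMmapWrites(false) asks RocksDB to use plain file
            // writes instead of mapped views, so FlushViewOfFile should
            // not be involved at all on this code path.
            try (Options opts = new Options()
                    .setCreateIfMissing(true)
                    .setAllowMmapWrites(false);
                 RocksDB db = RocksDB.open(opts, "C:\\tmp\\mmap-test")) {
                // enough puts to force at least one SST file to be flushed
                for (int i = 0; i < 100_000; i++) {
                    byte[] kv = ("key-" + i).getBytes(StandardCharsets.UTF_8);
                    db.put(kv, kv);
                }
            }
        }
    }

I'll also point Sysinternals' handle.exe at the .geogig directory while the clone runs, to see which process actually owns the lock.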
Thanks,
David.