
Re: [geogig-dev] Stacktrace when cloning a repo on Windows (Erik Merkle)

Thanks for the info, David. I've created an issue on GitHub for this:

https://github.com/locationtech/geogig/issues/302

We'll try to reproduce the issue and see if we can fix it inside GeoGig itself, or determine if it might be a bug in the Windows implementation of the RocksDB library.

I'm not sure I can offer a suitable workaround, but if you want or need to have this data in a GeoGig repository on Windows, you could try exporting the data from the remote (GeoServer) repository and importing it into a freshly init'ed RocksDB repository on Windows (though I can't say for certain that the issue you're hitting won't affect the import in the same way). Also, if you go this route for now, be aware that the import may not bring over any revision history from the remote repository.
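As a very rough sketch only (the layer tree path and file names below are placeholders, the data is assumed to round-trip cleanly through a shapefile, and the exact shp export/import options should be checked against the GeoGig manual), the export/import could look something like:

    # On a machine that can read the data from the remote repository
    # (e.g. wherever a working clone or the server-side repo lives):
    geogig shp export <layer-tree-path> layer.shp

    # On the Windows machine, in a freshly init'ed repository:
    geogig init
    geogig shp import layer.shp
    geogig add
    geogig commit -m "Import data exported from the remote repository"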

Erik Merkle
Software Engineer | Boundless


On Wed, May 3, 2017 at 11:27 AM, Herbert, David J. <darb1@xxxxxxxxx> wrote:

Hi Erik,

 

Thanks for your input. I replaced the rocksdb jar with 4.11.2 and repeated the clone; I now get:

 

Fetching objects from refs/heads/master
17:12:32.171 [main] ERROR o.locationtech.geogig.cli.GeogigCLI - An unhandled error occurred: org.rocksdb.RocksDBException: IO error: Failed to FlushViewOfFile: C:\Users\DARB1.BS-DARB1-L5\Documents\projects\add\data\working\master\.geogig\objects.rocksdb/000015.sst: The process cannot access the file because another process has locked a portion of the file.
. See the log for more details.
java.lang.RuntimeException: org.rocksdb.RocksDBException: IO error: Failed to FlushViewOfFile: C:\Users\DARB1.BS-DARB1-L5\Documents\projects\add\data\working\master\.geogig\objects.rocksdb/000015.sst: The process cannot access the file because another process has locked a portion of the file.
        at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-18.0.jar:na]
        at org.locationtech.geogig.rocksdb.RocksdbObjectStore.putAll(RocksdbObjectStore.java:382) ~[geogig-rocksdb-1.0.jar:1.0]
        at org.locationtech.geogig.storage.impl.ForwardingObjectDatabase.putAll(ForwardingObjectDatabase.java:162) ~[geogig-core-1.0.jar:1.0]
        at org.locationtech.geogig.di.ObjectDatabasePutInterceptor$GraphUpdatingObjectDatabase.putAll(ObjectDatabasePutInterceptor.java:98) ~[geogig-core-1.0.jar:1.0]
        at org.locationtech.geogig.storage.impl.ForwardingObjectDatabase.putAll(ForwardingObjectDatabase.java:162) ~[geogig-core-1.0.jar:1.0]
        at org.locationtech.geogig.remote.BinaryPackedObjects.ingest(BinaryPackedObjects.java:224) ~[geogig-core-1.0.jar:1.0]
        at org.locationtech.geogig.remote.HttpRemoteRepo.fetchMoreData(HttpRemoteRepo.java:456) ~[geogig-core-1.0.jar:1.0]
        at org.locationtech.geogig.remote.HttpRemoteRepo.fetchNewData(HttpRemoteRepo.java:208) ~[geogig-core-1.0.jar:1.0]
        at org.locationtech.geogig.porcelain.FetchOp._call(FetchOp.java:257) ~[geogig-core-1.0.jar:1.0]
        at org.locationtech.geogig.porcelain.FetchOp._call(FetchOp.java:48) ~[geogig-core-1.0.jar:1.0]
        at org.locationtech.geogig.repository.AbstractGeoGigOp.call(AbstractGeoGigOp.java:153) ~[geogig-api-1.0.jar:1.0]
        at org.locationtech.geogig.porcelain.CloneOp._call(CloneOp.java:163) ~[geogig-core-1.0.jar:1.0]
        at org.locationtech.geogig.porcelain.CloneOp._call(CloneOp.java:39) ~[geogig-core-1.0.jar:1.0]
        at org.locationtech.geogig.repository.AbstractGeoGigOp.call(AbstractGeoGigOp.java:153) ~[geogig-api-1.0.jar:1.0]
        at org.locationtech.geogig.cli.porcelain.Clone.runInternal(Clone.java:164) ~[geogig-cli-1.0.jar:1.0]
        at org.locationtech.geogig.cli.AbstractCommand.run(AbstractCommand.java:68) ~[geogig-cli-1.0.jar:1.0]
        at org.locationtech.geogig.cli.GeogigCLI.executeInternal(GeogigCLI.java:528) ~[geogig-cli-1.0.jar:1.0]
        at org.locationtech.geogig.cli.GeogigCLI.execute(GeogigCLI.java:367) ~[geogig-cli-1.0.jar:1.0]
        at org.locationtech.geogig.cli.app.CLI.main(CLI.java:80) [geogig-cli-app-1.0.jar:1.0]
Caused by: org.rocksdb.RocksDBException: IO error: Failed to FlushViewOfFile: C:\Users\DARB1.BS-DARB1-L5\Documents\projects\add\data\working\master\.geogig\objects.rocksdb/000015.sst: The process cannot access the file because another process has locked a portion of the file.
        at org.rocksdb.RocksDB.write0(Native Method) ~[rocksdbjni-4.11.2.jar:na]
        at org.rocksdb.RocksDB.write(RocksDB.java:555) ~[rocksdbjni-4.11.2.jar:na]
        at org.locationtech.geogig.rocksdb.RocksdbObjectStore.putAll(RocksdbObjectStore.java:371) ~[geogig-rocksdb-1.0.jar:1.0]
        ... 17 common frames omitted
An unhandled error occurred: org.rocksdb.RocksDBException: IO error: Failed to FlushViewOfFile: C:\Users\DARB1.BS-DARB1-L5\Documents\projects\add\data\working\master\.geogig\objects.rocksdb/000015.sst: The process cannot access the file because another process has locked a portion of the file.
. See the log for more details.

 

The log contains nothing more than this, apart from three occurrences of the following line before the stacktrace:

2017-05-03 17:17:07,712 INFO [main] o.l.g.p.CreateDeduplicator [CreateDeduplicator.java:40] No DeduplicationService service found, using default heap based one

I guess this new error reveals it's some kind of file locking issue.
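If it really is another process holding a lock, one quick way to see who has the file open (a sketch only, assuming the Sysinternals Handle utility is installed; the file and directory names are just the ones from the trace) would be something like:

    # List processes holding handles that match the locked .sst file
    handle.exe 000015.sst

    # Or search by the containing objects.rocksdb directory
    handle.exe objects.rocksdb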

 

Thanks,


David.

 




