
Re: [geotrellis-user] Data Missing when extracting pixel value from Raster

The issue is that the yellow dots should fully overlap the black points (we are simply extracting lat/long and elevation values from a raster image). Wherever the yellow dots do not overlap, the data has not been extracted for those points. NoData is handled by the check !Double.isNaN(t.getDouble(j, i)). Nothing is kept in memory: we write everything back to disk and visualize the image in QGIS. I have updated the code snippet below with comments, which might help you understand it further. I appreciate your help on this.
    public void testMethod(String rastedImageDir, String TAB_SEP, JavaSparkContext javaSparkContext) {
        // List of supported file extensions
        List<String> tiffExtensions = new ArrayList<>();
        tiffExtensions.add(".tif");
        tiffExtensions.add(".TIF");
        tiffExtensions.add(".tiff");
        tiffExtensions.add(".TIFF");

        // Build the Options object passed to HadoopGeoTiffRDD for reading the raster
        final scala.Option<CRS> crsNone = scala.Option.apply(null);
        final scala.Option<Object> objectNone = scala.Option.apply(null);
        final scala.Option<Object> numPartitionsObject = scala.Option.apply(10);
        final scala.Option<Object> tileSize = scala.Option.apply(256);
        final scala.Option<Object> partitionBytes = scala.Option.apply(128L * 1024 * 1024);
        HadoopGeoTiffRDD.Options options = new HadoopGeoTiffRDD.Options$().apply(
                JavaConverters.asScalaIteratorConverter(tiffExtensions.iterator()).asScala().toSeq(), crsNone,
                HadoopGeoTiffRDD.GEOTIFF_TIME_TAG_DEFAULT(), HadoopGeoTiffRDD.GEOTIFF_TIME_FORMAT_DEFAULT(), tileSize,
                numPartitionsObject, partitionBytes, objectNone);
        // End of Options object creation for HadoopGeoTiffRDD
       
       
       
        // Read the raster file(s) from the directory using the Options object created in the previous step
        RDD<Tuple2<ProjectedExtent, Tile>> rasterImageRdd = HadoopGeoTiffRDD.spatial(new Path(rastedImageDir), options,
                javaSparkContext.sc());
        // convert to Java RDD
        JavaRDD<Tuple2<ProjectedExtent, Tile>> rasterImageJavaRdd = rasterImageRdd.toJavaRDD();
       
        // Extract the lat/long centroid (as a point) and the elevation value of each pixel of the raster image
        JavaRDD<String> pixelRdd = rasterImageJavaRdd
                .flatMap(new FlatMapFunction<Tuple2<ProjectedExtent, Tile>, String>() {
                    private static final long serialVersionUID = -6395159549445351446L;

                    public Iterator<String> call(Tuple2<ProjectedExtent, Tile> v1) throws Exception {
                        ArrayList<String> list = new ArrayList<String>();
                        Tile t = v1._2;
                        ProjectedExtent projectedExtent = v1._1;
                        ProjectedRaster<CellGrid> r = new ProjectedRaster<CellGrid>(
                                new Raster<CellGrid>(t, projectedExtent.extent()), projectedExtent.crs());
                        GeometryFactory geometryFactory = new GeometryFactory(new PrecisionModel(), 4283);
                        WKTWriter wktWriter = new WKTWriter();
                        for (int i = 0; i < t.rows(); i++) {
                            for (int j = 0; j < t.cols(); j++) {
                                StringBuilder sb = new StringBuilder();
                                if (!Double.isNaN(t.getDouble(j, i))) {
                                    Double elevation = t.getDouble(j, i);
                                    Tuple2<Object, Object> longLatTupel = r.raster().rasterExtent().gridToMap(j, i);
                                    if (longLatTupel._2() != null && longLatTupel._1() != null) {
                                        Double latitude = (Double) longLatTupel._2();
                                        Double longitude = (Double) longLatTupel._1();
                                        Point point = geometryFactory.createPoint(new Coordinate(longitude, latitude));
                                        sb.append(elevation).append(TAB_SEP);
                                        sb.append(point);
                                        list.add(sb.toString());
                                    }
                                }
                            }
                        }
                        return list.iterator();
                    }
                });
    }
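
As an additional sanity check of the NoData handling, here is a minimal sketch (not part of the method above; it reuses the rasterImageJavaRdd built there, and the counts only go to the executor logs) that prints each tile's cell type and counts how many cells the Double.isNaN test skips:

    // Illustrative sanity check: print each tile's cell type and count the cells
    // that the !Double.isNaN(...) filter above would treat as NoData and skip.
    rasterImageJavaRdd.foreach(pair -> {
        Tile tile = pair._2;
        long skipped = 0;
        for (int row = 0; row < tile.rows(); row++) {
            for (int col = 0; col < tile.cols(); col++) {
                if (Double.isNaN(tile.getDouble(col, row))) {
                    skipped++;
                }
            }
        }
        System.out.println(tile.cellType() + " -> cells skipped as NoData: " + skipped);
    });

The output RDD itself is written back to disk with something along the lines of pixelRdd.saveAsTextFile(outputDir) (outputDir being a placeholder for the actual output path) and then loaded into QGIS.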


On Thu, Mar 28, 2019 at 1:05 AM Grigory Pomadchin <gr.pomadchin@xxxxxxxxx> wrote:
Hey Ashish,

So what is the question? Is it about the nature of the dots you see?
It’s a bit hard to parse your code, and it’s a bit unclear what the question is.
However, I’m wondering: is this what you see with the scene you just loaded into Spark memory, and that’s all?
One of the possible issues you could be hitting is a NoData issue - maybe you’re hitting NoData points and that’s why they are missing here.
If it’s possible, I would like to request some more details to answer the question.
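For instance (just a sketch, reusing the rasterImageJavaRdd from your snippet below; the output format is illustrative), printing the CRS and cell type of the first loaded tile would already help:

    // Illustrative: report the projection and cell type of the first loaded tile.
    rasterImageJavaRdd.take(1).forEach(pair ->
            System.out.println("CRS: " + pair._1.crs() + ", cellType: " + pair._2.cellType()));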

Thanks,

Grigory

On Wed, Mar 27, 2019 at 8:25 AM Ashish Agarwal <ashishagarwal1983@xxxxxxxxx> wrote:
Hello, we are trying to convert an LZW-compressed raster (TIFF) into text format, i.e. extracting the centroid and band 1 value of each pixel. We see that some parts of the raster are missed during the conversion; however, there is a pattern to the missing data (it comes in unequal stripes). Please see the attached snapshot: the yellow dots show the converted centroids, and wherever there is no yellow overlap the data was missed during conversion.

Code :-

    List<String> tiffExtensions = new ArrayList<>();
    tiffExtensions.add(".tif");
    tiffExtensions.add(".TIF");
    tiffExtensions.add(".tiff");
    tiffExtensions.add(".TIFF");

    final scala.Option<CRS> crsNone = scala.Option.apply(null);
    final scala.Option<Object> objectNone = scala.Option.apply(null);
    final scala.Option<Object> numPartitionsObject = scala.Option.apply(new Integer(10));
    final scala.Option<Object> tileSize = scala.Option.apply(Integer.parseInt("256"));
    final scala.Option<Object> partitionBytes = scala.Option.apply(128l * 1024 * 1024);
    HadoopGeoTiffRDD.Options options = new HadoopGeoTiffRDD.Options$().apply(
            JavaConverters.asScalaIteratorConverter(tiffExtensions.iterator()).asScala().toSeq(), crsNone,
            HadoopGeoTiffRDD.GEOTIFF_TIME_TAG_DEFAULT(), HadoopGeoTiffRDD.GEOTIFF_TIME_FORMAT_DEFAULT(), tileSize,
            numPartitionsObject, partitionBytes, objectNone);
    RDD<Tuple2<ProjectedExtent, Tile>> rasterImageRdd = HadoopGeoTiffRDD.spatial(new Path(rastedImageDir), options,
            javaSparkContext.sc());
    JavaRDD<Tuple2<ProjectedExtent, Tile>> rasterImageJavaRdd = rasterImageRdd.toJavaRDD();
    JavaRDD<String> pixelRdd = rasterImageJavaRdd
            .flatMap(new FlatMapFunction<Tuple2<ProjectedExtent, Tile>, String>() {
                private static final long serialVersionUID = -6395159549445351446L;

                public Iterator<String> call(Tuple2<ProjectedExtent, Tile> v1) throws Exception {
                    ArrayList<String> list = new ArrayList<String>();
                    Tile t = v1._2;
                    ProjectedExtent projectedExtent = v1._1;
                    ProjectedRaster<CellGrid> r = new ProjectedRaster<CellGrid>(
                            new Raster<CellGrid>(t, projectedExtent.extent()), projectedExtent.crs());
                    GeometryFactory geometryFactory = new GeometryFactory(new PrecisionModel(), 4283);
                    WKTWriter wktWriter = new WKTWriter();
                    for (int i = 0; i < t.rows(); i++) {
                        for (int j = 0; j < t.cols(); j++) {
                            StringBuilder sb = new StringBuilder();
                            if (!Double.isNaN(t.getDouble(j, i))) {
                                Double elevation = t.getDouble(j, i);
                                Tuple2<Object, Object> longLatTupel = r.raster().rasterExtent().gridToMap(j, i);
                                if (longLatTupel._2() != null && longLatTupel._1() != null) {
                                    Double latitude = Double.parseDouble(longLatTupel._2() + "");
                                    Double longitude = Double.parseDouble(longLatTupel._1() + "");
                                    Point point = geometryFactory.createPoint(new Coordinate(longitude, latitude));
                                    sb.append(elevation).append(TAB_SEP);
                                    sb.append(point);
                                    list.add(sb.toString());
                                }
                            }
                        }
                    }
                    return list.iterator();
                }
            });
--
Regards
Ashish Agarwal


Ph - +91- 9711163631


--
Grigory Pomadchin
LinkedIn: http://www.linkedin.com/in/grpomadchin


--
Regards
Ashish Agarwal

Ph - +91- 9711163631
