Use of org.apache.sis.internal.storage.io.Region in project sis by apache.
Example from the class VariableInfo, method readArray.
/**
 * Reads the data from this variable and returns them as an array of a Java primitive type.
 * Multi-dimensional variables are flattened as a one-dimensional array (wrapped in a vector).
 * Fill values/missing values are replaced by NaN if {@link #hasRealValues()} is {@code true}.
 * Array elements are in "natural" order (inverse of netCDF order).
 *
 * @param  area         indices (in "natural" order) of cell values to read, or {@code null} for the whole variable.
 * @param  subsampling  subsampling along each dimension, or {@code null} if none. Ignored if {@code area} is null.
 * @return the data as an array of a Java primitive type.
 * @throws ArithmeticException if the size of the variable exceeds {@link Integer#MAX_VALUE}, or if another overflow occurs.
 *
 * @see #read()
 * @see #read(GridExtent, int[])
 */
private Object readArray(final GridExtent area, int[] subsampling) throws IOException, DataStoreException {
    if (reader == null) {
        throw new DataStoreContentException(unknownType());
    }
    final int dimension = dimensions.length;
    final long[] lower = new long[dimension];
    final long[] upper = new long[dimension];
    final long[] size  = (area != null) ? new long[dimension] : upper;
    /*
     * NetCDF stores data in reverse dimension order. Example:
     *
     * DIMENSIONS:
     *   time: 3
     *   lat : 2
     *   lon : 4
     *
     * VARIABLES:
     *   temperature (time,lat,lon)
     *
     * DATA INDICES:
     *   (0,0,0) (0,0,1) (0,0,2) (0,0,3)
     *   (0,1,0) (0,1,1) (0,1,2) (0,1,3)
     *   (1,0,0) (1,0,1) (1,0,2) (1,0,3)
     *   (1,1,0) (1,1,1) (1,1,2) (1,1,3)
     *   (2,0,0) (2,0,1) (2,0,2) (2,0,3)
     *   (2,1,0) (2,1,1) (2,1,2) (2,1,3)
     */
    for (int i = 0; i < dimension; i++) {
        size[i] = dimensions[(dimension - 1) - i].length();
        if (area != null) {
            lower[i] = area.getLow(i);
            upper[i] = Math.incrementExact(area.getHigh(i));
        }
    }
    if (subsampling == null) {
        subsampling = new int[dimension];
        Arrays.fill(subsampling, 1);
    }
    final Region region = new Region(size, lower, upper, subsampling);
    /*
     * If this variable uses the unlimited dimension, we have to skip the records of all other unlimited
     * variables before reaching the next record of this variable. The current implementation can do that
     * only if the number of bytes to skip is a multiple of the data type size. It should be the case most
     * of the time, because variables in netCDF files have a 4-byte padding. It may not work, however, if
     * the variable uses the {@code long} or {@code double} type.
     */
    if (isUnlimited()) {
        if (offsetToNextRecord < 0) {
            throw canNotComputePosition(null);
        }
        region.setAdditionalByteOffset(dimensions.length - 1, offsetToNextRecord);
    }
    Object array = reader.read(region);
    replaceNaN(array);
    if (area == null && array instanceof double[]) {
        /*
         * If we can convert a double[] array to a float[] array, we should do that before invoking
         * `setValues(array)`; we cannot rely on data.compress(tolerance). The reason is that we assume
         * float[] arrays to be accurate in base 10 even if the data were originally stored as doubles.
         * The Vector class does not make that assumption, since it is specific to what we observe with
         * netCDF files. To enable this assumption, we need to convert to float[] before createDecimalVector(…).
         */
        final float[] copy = ArraysExt.copyAsFloatsIfLossless((double[]) array);
        if (copy != null) {
            array = copy;
        }
    }
    return array;
}
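As a plain-Java illustration of the dimension reversal performed above (all names here are hypothetical helpers, not the SIS API), the following sketch shows how the netCDF dimension lengths of a `temperature(time, lat, lon)` variable are reversed into "natural" (lon, lat, time) order, and how a cell is then located in the flattened one-dimensional array:

```java
// Minimal sketch, assuming the same dimension reversal as readArray(…).
public class NaturalOrderDemo {
    /** Reverses netCDF-order dimension lengths into "natural" order. */
    static long[] naturalOrderSizes(long[] netcdfOrder) {
        final int n = netcdfOrder.length;
        final long[] size = new long[n];
        for (int i = 0; i < n; i++) {
            size[i] = netcdfOrder[(n - 1) - i];       // same reversal as in readArray(…)
        }
        return size;
    }

    /** Flat index of cell (x, y, t): the fastest-varying dimension comes first. */
    static int flatIndex(long[] size, int x, int y, int t) {
        return (int) (x + size[0] * (y + size[1] * t));
    }

    public static void main(String[] args) {
        // Dimensions declared in netCDF order: time=3, lat=2, lon=4.
        final long[] size = naturalOrderSizes(new long[] {3, 2, 4});
        System.out.println(java.util.Arrays.toString(size));   // [4, 2, 3]
        // Last cell (lon=3, lat=1, time=2) of the 24-element flattened array:
        System.out.println(flatIndex(size, 3, 1, 2));          // 23
    }
}
```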
Use of org.apache.sis.internal.storage.io.Region in project sis by apache.
Example from the class DataSubset, method readSlice.
/**
 * Reads a two-dimensional slice of the data cube from the given input channel. This method is usually
 * invoked for reading the tile in full, in which case the {@code lower} argument is (0,0) and the
 * {@code upper} argument is the tile size. But those arguments may identify a smaller region if the
 * {@link DataSubset} contains only one (potentially large) tile.
 *
 * <p>The length of the {@code lower}, {@code upper} and {@code subsampling} arrays shall be 2.</p>
 *
 * <h4>Default implementation</h4>
 * The default implementation in this base class assumes uncompressed data without band subset.
 * Subsampling on the <var>X</var> axis is not supported if the image has interleaved pixels.
 * Packed pixels (e.g. bilevel images with 8 pixels per byte) are not supported.
 * Those restrictions are verified by {@link DataCube#canReadDirect(TiledGridResource.Subset)}.
 * Subclasses must override this method for handling decompression or for resolving the above-cited limitations.
 *
 * @todo It is possible to relax some of those restrictions a little bit. If the tile width is a divisor
 *       of the sample size, we could round {@code lower[0]} and {@code upper[0]} to a multiple
 *       of {@code sampleSize}. We would need to adjust the coordinates of the returned image accordingly.
 *       This adjustment needs to be done by the caller.
 *
 * @param  offsets      positions in the channel where tile data begin, one value per bank.
 * @param  byteCounts   numbers of bytes for the compressed tile data, one value per bank.
 * @param  lower        (<var>x</var>, <var>y</var>) coordinates of the first pixel to read relative to the tile.
 * @param  upper        (<var>x</var>, <var>y</var>) coordinates after the last pixel to read relative to the tile.
 * @param  subsampling  (<var>sx</var>, <var>sy</var>) subsampling factors.
 * @param  location     pixel coordinates in the upper-left corner of the tile to return.
 * @return a single tile decoded from the GeoTIFF file.
 * @throws IOException if an I/O error occurred.
 * @throws DataStoreException if a logical error occurred.
 * @throws RuntimeException if the Java2D image cannot be created for another reason
 *         (too many exception types to list them all).
 *
 * @see DataCube#canReadDirect(TiledGridResource.Subset)
 */
Raster readSlice(final long[] offsets, final long[] byteCounts, final long[] lower, final long[] upper,
                 final int[] subsampling, final Point location) throws IOException, DataStoreException
{
    final DataType type = getDataType();
    final int sampleSize = type.size();       // Assumed same as `SampleModel.getSampleSize(…)` by pre-conditions.
    final long width  = subtractExact(upper[X_DIMENSION], lower[X_DIMENSION]);
    final long height = subtractExact(upper[Y_DIMENSION], lower[Y_DIMENSION]);
    /*
     * The number of bytes to read should not be greater than `byteCount`. It may be smaller, however, if
     * only a subregion is read. Note that the `length` value may be different from `capacity` if the tile
     * to read is smaller than the "standard" tile size of the image. That happens often when reading the
     * last strip. This length is used only for verification purposes, so it does not need to be exact.
     */
    final long length = ceilDiv(width * height * sourcePixelStride * sampleSize, Byte.SIZE);
    final long[] size = new long[] {
        multiplyFull(sourcePixelStride, getTileSize(X_DIMENSION)),
        getTileSize(Y_DIMENSION)
    };
    /*
     * If we use an interleaved sample model, each "element" from the `HyperRectangleReader` perspective is
     * actually a group of `sourcePixelStride` values. Note that in such a case, we cannot handle subsampling
     * on the first axis. Such cases should be handled by the `CompressedSubset` subclass instead, even if
     * there is no compression.
     */
    assert sourcePixelStride == 1 || subsampling[X_DIMENSION] == 1;
    lower[X_DIMENSION] *= sourcePixelStride;
    upper[X_DIMENSION] *= sourcePixelStride;
    /*
     * Read each plane ("bank" in Java2D terminology). Note that a single bank contains all bands in the
     * interleaved sample model case. This block assumes that each bank element contains exactly one sample
     * value (verified by an assertion), as documented in the Javadoc of this method. If that assumption
     * were not true, we would have to adjust `capacity`, `lower[0]` and `upper[0]`
     * (we may do that as an optimization in a future version).
     */
    final HyperRectangleReader hr = new HyperRectangleReader(ImageUtilities.toNumberEnum(type.toDataBufferType()), input());
    final Region region = new Region(size, lower, upper, subsampling);
    final Buffer[] banks = new Buffer[numBanks];
    for (int b = 0; b < numBanks; b++) {
        if (b < byteCounts.length && length > byteCounts[b]) {
            throw new DataStoreContentException(source.reader.resources().getString(
                    Resources.Keys.UnexpectedTileLength_2, length, byteCounts[b]));
        }
        hr.setOrigin(offsets[b]);
        assert model.getSampleSize(b) == sampleSize;       // See above comment.
        final Buffer bank = hr.readAsBuffer(region, getBankCapacity(1));
        fillRemainingRows(bank);
        banks[b] = bank;
    }
    final DataBuffer buffer = RasterFactory.wrap(type, banks);
    return Raster.createWritableRaster(model, buffer, location);
}
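The byte-length estimate and the pixel-stride scaling used above can be sketched in isolation. The following is a minimal standalone illustration (hypothetical names, not the SIS API) of the `length` computation for an uncompressed region and of the X-bound scaling applied for interleaved sample models:

```java
// Minimal sketch, assuming an uncompressed region with interleaved samples.
public class SliceLengthDemo {
    /** Rounds a/b upward; equivalent of the ceilDiv used in readSlice(…) for positive values. */
    static long ceilDiv(long a, long b) {
        return (a + b - 1) / b;
    }

    /** Estimated number of bytes covered by a (width × height) region. */
    static long regionLengthInBytes(long width, long height, int pixelStride, int sampleSizeBits) {
        return ceilDiv(width * height * pixelStride * sampleSizeBits, Byte.SIZE);
    }

    public static void main(String[] args) {
        // A 256×256 region of interleaved RGB bytes (stride 3, 8 bits per sample):
        System.out.println(regionLengthInBytes(256, 256, 3, 8));   // 196608
        // Scaling tile-relative X bounds by the pixel stride, as done before reading,
        // so that pixel columns 10..20 become element columns 30..60:
        final long lowerX = 10 * 3, upperX = 20 * 3;
        System.out.println(lowerX + ".." + upperX);                // 30..60
    }
}
```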
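Both examples construct a `Region(size, lower, upper, subsampling)`. As a rough illustration of how many values such a request yields along one dimension (a hypothetical helper, not the SIS `Region` class; `upper` is exclusive, as set from `Math.incrementExact(area.getHigh(i))` in the first example):

```java
// Minimal sketch of per-dimension extent under subsampling.
public class RegionExtentDemo {
    /** Number of values kept along one dimension: every n-th value starting at `lower`. */
    static long valuesPerDimension(long lower, long upper, int subsampling) {
        return (upper - lower + (subsampling - 1)) / subsampling;
    }

    public static void main(String[] args) {
        System.out.println(valuesPerDimension(0, 10, 1));   // 10
        System.out.println(valuesPerDimension(0, 10, 3));   // 4  (indices 0, 3, 6, 9)
        System.out.println(valuesPerDimension(2, 9, 2));    // 4  (indices 2, 4, 6, 8)
    }
}
```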