Example 56 with ArrayWritable

Use of org.apache.hadoop.io.ArrayWritable in the apache/hive project.

From the class TestStandardParquetHiveMapInspector, method testNullContainer:

@Test
public void testNullContainer() {
    final ArrayWritable map = new ArrayWritable(ArrayWritable.class, null);
    assertNull("Should be null", inspector.getMapValueElement(map, new IntWritable(0)));
}
Also used: ArrayWritable (org.apache.hadoop.io.ArrayWritable), IntWritable (org.apache.hadoop.io.IntWritable), Test (org.junit.Test)

Example 57 with ArrayWritable

Use of org.apache.hadoop.io.ArrayWritable in the apache/hive project.

From the class TestStandardParquetHiveMapInspector, method testEmptyContainer:

@Test
public void testEmptyContainer() {
    final ArrayWritable map = new ArrayWritable(ArrayWritable.class, new ArrayWritable[0]);
    assertNull("Should be null", inspector.getMapValueElement(map, new IntWritable(0)));
}
Also used: ArrayWritable (org.apache.hadoop.io.ArrayWritable), IntWritable (org.apache.hadoop.io.IntWritable), Test (org.junit.Test)
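Examples 56 and 57 pin down the inspector's guard clauses: a null backing array and an empty backing array must both yield null rather than throw. Stripped of the Hadoop types, the lookup logic these tests cover can be sketched in plain Java (the `MapLookupSketch` class and its `lookup` helper are hypothetical illustrations, not Hive's actual implementation):

```java
import java.util.Objects;

public class MapLookupSketch {

    // A Parquet map reaches the inspector as an array of two-element
    // {key, value} entries. Null and empty containers must return null.
    static Object lookup(Object[][] entries, Object key) {
        if (entries == null || entries.length == 0) {
            return null; // the two cases the tests above cover
        }
        for (Object[] entry : entries) {
            if (Objects.equals(entry[0], key)) {
                return entry[1];
            }
        }
        return null; // key not present
    }

    public static void main(String[] args) {
        System.out.println(lookup(null, 0));            // null
        System.out.println(lookup(new Object[0][], 0)); // null
        Object[][] map = { { 0, "a" }, { 2, "b" } };
        System.out.println(lookup(map, 2));             // b
    }
}
```

The point the tests make is that the container checks come before any element access, so a degenerate map can never cause a NullPointerException or ArrayIndexOutOfBoundsException inside the inspector.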

Example 58 with ArrayWritable

Use of org.apache.hadoop.io.ArrayWritable in the apache/hive project.

From the class TestAbstractParquetMapInspector, method testRegularMap:

@Test
public void testRegularMap() {
    final Writable[] entry1 = new Writable[] { new IntWritable(0), new IntWritable(1) };
    final Writable[] entry2 = new Writable[] { new IntWritable(2), new IntWritable(3) };
    final ArrayWritable map = new ArrayWritable(ArrayWritable.class, new Writable[] { new ArrayWritable(Writable.class, entry1), new ArrayWritable(Writable.class, entry2) });
    final Map<Writable, Writable> expected = new HashMap<Writable, Writable>();
    expected.put(new IntWritable(0), new IntWritable(1));
    expected.put(new IntWritable(2), new IntWritable(3));
    assertEquals("Wrong size", 2, inspector.getMapSize(map));
    assertEquals("Wrong result of inspection", expected, inspector.getMap(map));
}
Also used: ArrayWritable (org.apache.hadoop.io.ArrayWritable), HashMap (java.util.HashMap), Writable (org.apache.hadoop.io.Writable), IntWritable (org.apache.hadoop.io.IntWritable), Test (org.junit.Test)
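The nesting in the test above is the key detail: the map is an ArrayWritable whose elements are themselves two-element ArrayWritables, one per {key, value} entry, and getMap walks that structure to rebuild a java.util.Map. A minimal plain-Java sketch of the conversion, assuming the same pair encoding (the `toMap` helper is hypothetical, not Hive's code):

```java
import java.util.HashMap;
import java.util.Map;

public class MapConversionSketch {

    // Rebuild a java.util.Map from an array of {key, value} pairs,
    // mirroring the shape the test above builds with nested ArrayWritables.
    static Map<Object, Object> toMap(Object[][] entries) {
        if (entries == null) {
            return null;
        }
        Map<Object, Object> result = new HashMap<>();
        for (Object[] entry : entries) {
            result.put(entry[0], entry[1]);
        }
        return result;
    }

    public static void main(String[] args) {
        Object[][] entries = { { 0, 1 }, { 2, 3 } };
        Map<Object, Object> map = toMap(entries);
        System.out.println(map.size()); // 2
        System.out.println(map.get(0)); // 1
        System.out.println(map.get(2)); // 3
    }
}
```

This is why the test can compare against a plain HashMap with assertEquals: once converted, the Writable keys and values carry ordinary equals semantics.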

Example 59 with ArrayWritable

Use of org.apache.hadoop.io.ArrayWritable in the apache/hive project.

From the class TestDeepParquetHiveMapInspector, method testNullContainer:

@Test
public void testNullContainer() {
    final ArrayWritable map = new ArrayWritable(ArrayWritable.class, null);
    assertNull("Should be null", inspector.getMapValueElement(map, new ShortWritable((short) 0)));
}
Also used: ArrayWritable (org.apache.hadoop.io.ArrayWritable), ShortWritable (org.apache.hadoop.hive.serde2.io.ShortWritable), Test (org.junit.Test)

Example 60 with ArrayWritable

Use of org.apache.hadoop.io.ArrayWritable in the apache/hive project.

From the class ParquetRecordReaderWrapper, method next:

@Override
public boolean next(final NullWritable key, final ArrayWritable value) throws IOException {
    if (eof) {
        return false;
    }
    try {
        if (firstRecord) {
            // key & value are already read.
            firstRecord = false;
        } else if (!realReader.nextKeyValue()) {
            // strictly not required, just for consistency
            eof = true;
            return false;
        }
        final ArrayWritable tmpCurValue = realReader.getCurrentValue();
        if (value != tmpCurValue) {
            // value must be non-null before we dereference it with get()
            if (value == null) {
                throw new IOException("DeprecatedParquetHiveInput can not support RecordReaders that" + " don't return same key & value & value is null");
            }
            final Writable[] arrValue = value.get();
            final Writable[] arrCurrent = tmpCurValue.get();
            if (arrValue.length != arrCurrent.length) {
                throw new IOException("DeprecatedParquetHiveInput : size of object differs. Value" + " size :  " + arrValue.length + ", Current Object size : " + arrCurrent.length);
            }
            System.arraycopy(arrCurrent, 0, arrValue, 0, arrCurrent.length);
        }
        return true;
    } catch (final InterruptedException e) {
        throw new IOException(e);
    }
}
Also used: ArrayWritable (org.apache.hadoop.io.ArrayWritable), NullWritable (org.apache.hadoop.io.NullWritable), Writable (org.apache.hadoop.io.Writable), IOException (java.io.IOException)
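The copy step in next() exists because the old MapReduce API reuses the caller-supplied value object, while the wrapped new-API reader maintains its own current value; when the two containers differ, the row must be copied element by element into the caller's object. Ignoring the Hadoop types, that reuse contract can be sketched as follows (the `copyInto` helper is hypothetical, for illustration only):

```java
import java.io.IOException;

public class RecordCopySketch {

    // Copy the reader's current row into the caller-owned container,
    // element by element, as the wrapper's next() does above.
    static void copyInto(Object[] dest, Object[] src) throws IOException {
        if (dest == src) {
            return; // same container: nothing to copy
        }
        if (dest == null) {
            throw new IOException("caller must supply a reusable value object");
        }
        if (dest.length != src.length) {
            throw new IOException("size of object differs: " + dest.length
                    + " vs " + src.length);
        }
        System.arraycopy(src, 0, dest, 0, src.length);
    }

    public static void main(String[] args) throws IOException {
        Object[] callerValue = new Object[3]; // reused across next() calls
        Object[] readerRow = { "a", "b", "c" };
        copyInto(callerValue, readerRow);
        System.out.println(callerValue[1]); // b
    }
}
```

A shallow arraycopy is enough here because the caller only reads the row before the next call; a length mismatch, by contrast, is unrecoverable and surfaces as an IOException, just as in the wrapper.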

Aggregations

ArrayWritable (org.apache.hadoop.io.ArrayWritable): 72
Test (org.junit.Test): 41
IntWritable (org.apache.hadoop.io.IntWritable): 31
Writable (org.apache.hadoop.io.Writable): 29
Path (org.apache.hadoop.fs.Path): 18
DoubleWritable (org.apache.hadoop.hive.serde2.io.DoubleWritable): 18
LongWritable (org.apache.hadoop.io.LongWritable): 18
RecordConsumer (org.apache.parquet.io.api.RecordConsumer): 18
ShortWritable (org.apache.hadoop.hive.serde2.io.ShortWritable): 15
ArrayList (java.util.ArrayList): 13
BytesWritable (org.apache.hadoop.io.BytesWritable): 10
List (java.util.List): 9
BooleanWritable (org.apache.hadoop.io.BooleanWritable): 8
FloatWritable (org.apache.hadoop.io.FloatWritable): 8
StructObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector): 6
NullWritable (org.apache.hadoop.io.NullWritable): 6
Text (org.apache.hadoop.io.Text): 6
ByteWritable (org.apache.hadoop.hive.serde2.io.ByteWritable): 5
PrimitiveObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector): 5
MapWritable (org.apache.hadoop.io.MapWritable): 5