Uses of org.apache.iceberg.data.Record in project hive by apache.
From class TestDeserializer, method testListDeserialize:
@Test
public void testListDeserialize() {
  Schema schema = new Schema(
      optional(1, "list_type", Types.ListType.ofOptional(2, Types.LongType.get())));
  StructObjectInspector inspector = ObjectInspectorFactory.getStandardStructObjectInspector(
      Arrays.asList("list_type"),
      Arrays.asList(ObjectInspectorFactory.getStandardListObjectInspector(
          PrimitiveObjectInspectorFactory.writableLongObjectInspector)));
  Deserializer deserializer = new Deserializer.Builder()
      .schema(schema)
      .writerInspector((StructObjectInspector) IcebergObjectInspector.create(schema))
      .sourceInspector(inspector)
      .build();
  Record expected = GenericRecord.create(schema);
  expected.set(0, Collections.singletonList(1L));
  Object[] data = new Object[] { new Object[] { new LongWritable(1L) } };
  Record actual = deserializer.deserialize(data);
  Assert.assertEquals(expected, actual);
}
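The nested Object[] shape in the test above mirrors how Hive hands a list column to the deserializer: the outer array is the struct row, the inner array is the list value, and each element arrives as a Writable wrapper that must be unwrapped to a plain Java value. A minimal stdlib-only sketch of that unwrapping follows; the class and method names here are hypothetical illustrations, not part of the Hive or Iceberg APIs.

```java
import java.util.ArrayList;
import java.util.List;

public class ListUnwrapSketch {

    // Stand-in for Hadoop's LongWritable: a boxed long behind a get() method.
    static final class FakeLongWritable {
        private final long value;
        FakeLongWritable(long value) { this.value = value; }
        long get() { return value; }
    }

    // Unwrap one list-typed struct field: row[fieldIndex] is itself an
    // Object[] whose elements are wrapped longs.
    static List<Long> unwrapLongList(Object[] row, int fieldIndex) {
        Object[] rawList = (Object[]) row[fieldIndex];
        List<Long> result = new ArrayList<>();
        for (Object element : rawList) {
            result.add(((FakeLongWritable) element).get());
        }
        return result;
    }

    public static void main(String[] args) {
        // Same shape as the test data: a one-row struct with one list column.
        Object[] row = new Object[] { new Object[] { new FakeLongWritable(1L) } };
        System.out.println(unwrapLongList(row, 0)); // [1]
    }
}
```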
From class TestDeserializer, method testStructDeserialize:
@Test
public void testStructDeserialize() {
  Deserializer deserializer = new Deserializer.Builder()
      .schema(CUSTOMER_SCHEMA)
      .writerInspector((StructObjectInspector) IcebergObjectInspector.create(CUSTOMER_SCHEMA))
      .sourceInspector(CUSTOMER_OBJECT_INSPECTOR)
      .build();
  Record expected = GenericRecord.create(CUSTOMER_SCHEMA);
  expected.set(0, 1L);
  expected.set(1, "Bob");
  Record actual = deserializer.deserialize(new Object[] { new LongWritable(1L), new Text("Bob") });
  Assert.assertEquals(expected, actual);
}
From class TestDeserializer, method testDeserializeEverySupportedType:
@Test
public void testDeserializeEverySupportedType() {
  Assume.assumeFalse("No test yet for Hive3 (Date/Timestamp creation)",
      MetastoreUtil.hive3PresentOnClasspath());
  Deserializer deserializer = new Deserializer.Builder()
      .schema(HiveIcebergTestUtils.FULL_SCHEMA)
      .writerInspector((StructObjectInspector) IcebergObjectInspector.create(HiveIcebergTestUtils.FULL_SCHEMA))
      .sourceInspector(HiveIcebergTestUtils.FULL_SCHEMA_OBJECT_INSPECTOR)
      .build();
  Record expected = HiveIcebergTestUtils.getTestRecord();
  Record actual = deserializer.deserialize(HiveIcebergTestUtils.valuesForTestRecord(expected));
  HiveIcebergTestUtils.assertEquals(expected, actual);
}
From class TestDeserializer, method testSchemaDeserialize:
@Test
public void testSchemaDeserialize() {
  StandardStructObjectInspector schemaObjectInspector =
      ObjectInspectorFactory.getStandardStructObjectInspector(
          Arrays.asList("0:col1", "1:col2"),
          Arrays.asList(
              PrimitiveObjectInspectorFactory.writableLongObjectInspector,
              PrimitiveObjectInspectorFactory.writableStringObjectInspector));
  Deserializer deserializer = new Deserializer.Builder()
      .schema(CUSTOMER_SCHEMA)
      .writerInspector((StructObjectInspector) IcebergObjectInspector.create(CUSTOMER_SCHEMA))
      .sourceInspector(schemaObjectInspector)
      .build();
  Record expected = GenericRecord.create(CUSTOMER_SCHEMA);
  expected.set(0, 1L);
  expected.set(1, "Bob");
  Record actual = deserializer.deserialize(new Object[] { new LongWritable(1L), new Text("Bob") });
  Assert.assertEquals(expected, actual);
}
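The source inspector in the test above names its fields "0:col1" and "1:col2", i.e. position-prefixed names, yet the deserializer still pairs values with record slots positionally. A stdlib-only sketch of that pairing is below; `stripPositionPrefix` and `toRecord` are hypothetical helpers for illustration, not Hive or Iceberg API.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PositionalMappingSketch {

    // Parse a field name of the form "<position>:<name>", as produced for
    // the schema object inspector above. Hypothetical helper, not Hive API.
    static String stripPositionPrefix(String fieldName) {
        int colon = fieldName.indexOf(':');
        return colon >= 0 ? fieldName.substring(colon + 1) : fieldName;
    }

    // Map a positional row (Object[]) onto named fields, the way the
    // deserializer lines up source-inspector fields with record slots.
    static Map<String, Object> toRecord(String[] fieldNames, Object[] row) {
        Map<String, Object> record = new LinkedHashMap<>();
        for (int i = 0; i < fieldNames.length; i++) {
            record.put(stripPositionPrefix(fieldNames[i]), row[i]);
        }
        return record;
    }

    public static void main(String[] args) {
        Map<String, Object> rec = toRecord(
            new String[] { "0:col1", "1:col2" },
            new Object[] { 1L, "Bob" });
        System.out.println(rec); // {col1=1, col2=Bob}
    }
}
```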
From class TestHiveIcebergPartitions, method testMultilevelIdentityPartitionedWrite:
@Test
public void testMultilevelIdentityPartitionedWrite() throws IOException {
  PartitionSpec spec = PartitionSpec.builderFor(HiveIcebergStorageHandlerTestUtils.CUSTOMER_SCHEMA)
      .identity("customer_id")
      .identity("last_name")
      .build();
  List<Record> records = TestHelper.generateRandomRecords(
      HiveIcebergStorageHandlerTestUtils.CUSTOMER_SCHEMA, 4, 0L);
  Table table = testTables.createTable(shell, "partitioned_customers",
      HiveIcebergStorageHandlerTestUtils.CUSTOMER_SCHEMA, spec, fileFormat, records);
  HiveIcebergTestUtils.validateData(table, records, 0);
}
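The spec above uses two identity transforms, so each record lands in the partition keyed by its raw (customer_id, last_name) pair. A stdlib-only sketch of that grouping, under the assumption that identity partitioning just buckets rows by the untransformed column values; the class and method names are hypothetical, not Iceberg API.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class IdentityPartitionSketch {

    // Group rows by the identity-transformed values of the partition
    // columns, mimicking a two-level spec such as
    // identity("customer_id") + identity("last_name").
    static Map<List<Object>, List<Object[]>> partition(List<Object[]> rows, int... keyIndexes) {
        Map<List<Object>, List<Object[]>> partitions = new LinkedHashMap<>();
        for (Object[] row : rows) {
            List<Object> key = new ArrayList<>();
            for (int idx : keyIndexes) {
                key.add(row[idx]); // identity transform: the raw column value
            }
            partitions.computeIfAbsent(key, k -> new ArrayList<>()).add(row);
        }
        return partitions;
    }

    public static void main(String[] args) {
        List<Object[]> rows = Arrays.asList(
            new Object[] { 1L, "Alice", "Brown" },
            new Object[] { 1L, "Bob", "Brown" },
            new Object[] { 2L, "Trudy", "Green" });
        // Partition on columns 0 (customer_id) and 2 (last_name):
        // two rows share (1, Brown), one row gets (2, Green).
        System.out.println(partition(rows, 0, 2).size()); // 2
    }
}
```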