
Example 1 with HiveCompactReaderFactory

Usage of org.apache.flink.connectors.hive.read.HiveCompactReaderFactory in the apache/flink project.

From the class HiveDeserializeExceptionTest, method parameters:

@Parameterized.Parameters(name = "{1}")
public static Object[] parameters() {
    HiveWriterFactory writerFactory =
            new HiveWriterFactory(
                    new JobConf(),
                    HiveIgnoreKeyTextOutputFormat.class,
                    new SerDeInfo(),
                    TableSchema.builder().build(),
                    new String[0],
                    new Properties(),
                    HiveShimLoader.loadHiveShim(HiveShimLoader.getHiveVersion()),
                    false);
    HiveCompactReaderFactory compactReaderFactory =
            new HiveCompactReaderFactory(
                    new StorageDescriptor(),
                    new Properties(),
                    new JobConf(),
                    new CatalogTableImpl(TableSchema.builder().build(), Collections.emptyMap(), null),
                    HiveShimLoader.getHiveVersion(),
                    RowType.of(DataTypes.INT().getLogicalType()),
                    false);
    HiveSourceBuilder builder =
            new HiveSourceBuilder(
                    new JobConf(),
                    new Configuration(),
                    new ObjectPath("default", "foo"),
                    HiveShimLoader.getHiveVersion(),
                    new CatalogTableImpl(
                            TableSchema.builder().field("i", DataTypes.INT()).build(),
                            Collections.emptyMap(),
                            null));
    builder.setPartitions(
            Collections.singletonList(
                    new HiveTablePartition(new StorageDescriptor(), new Properties())));
    HiveSource<RowData> hiveSource = builder.buildWithDefaultBulkFormat();
    return new Object[][] {
        new Object[] {writerFactory, writerFactory.getClass().getSimpleName()},
        new Object[] {compactReaderFactory, compactReaderFactory.getClass().getSimpleName()},
        new Object[] {hiveSource, hiveSource.getClass().getSimpleName()}
    };
}
Also used:
ObjectPath (org.apache.flink.table.catalog.ObjectPath)
Configuration (org.apache.flink.configuration.Configuration)
SerDeInfo (org.apache.hadoop.hive.metastore.api.SerDeInfo)
StorageDescriptor (org.apache.hadoop.hive.metastore.api.StorageDescriptor)
HiveCompactReaderFactory (org.apache.flink.connectors.hive.read.HiveCompactReaderFactory)
Properties (java.util.Properties)
RowData (org.apache.flink.table.data.RowData)
CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl)
HiveWriterFactory (org.apache.flink.connectors.hive.write.HiveWriterFactory)
JobConf (org.apache.hadoop.mapred.JobConf)
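The parameterized test above builds three serializable Flink objects (a writer factory, a compact reader factory, and a source) and hands each to the test together with its class name. A test named HiveDeserializeExceptionTest plausibly pushes each parameter through a Java serialization round trip; the mechanism can be sketched with plain java.io serialization. The class SerializationRoundTrip and the String payload below are illustrative assumptions, not Flink code:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Hypothetical sketch of the round trip a serialization test would perform
// on each parameter object. Real factories would replace the String payload.
public class SerializationRoundTrip {

    // Serialize any Serializable object to a byte array.
    public static byte[] serialize(Serializable obj) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        return bos.toByteArray();
    }

    // Deserialize the byte array back into an object; any exception here is
    // exactly what a deserialize-exception test would be probing for.
    public static Object deserialize(byte[] bytes) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        String original = "hive-factory-state"; // stand-in for a factory instance
        Object copy = deserialize(serialize(original));
        System.out.println(copy.equals(original)); // prints "true"
    }
}
```

Each JUnit run would then assert that the deserialized copy is usable, or that a deliberately broken object fails with a clear exception message.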

Aggregations

Properties (java.util.Properties): 1
Configuration (org.apache.flink.configuration.Configuration): 1
HiveCompactReaderFactory (org.apache.flink.connectors.hive.read.HiveCompactReaderFactory): 1
HiveWriterFactory (org.apache.flink.connectors.hive.write.HiveWriterFactory): 1
CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl): 1
ObjectPath (org.apache.flink.table.catalog.ObjectPath): 1
RowData (org.apache.flink.table.data.RowData): 1
SerDeInfo (org.apache.hadoop.hive.metastore.api.SerDeInfo): 1
StorageDescriptor (org.apache.hadoop.hive.metastore.api.StorageDescriptor): 1
JobConf (org.apache.hadoop.mapred.JobConf): 1