
Example 1 with TableSink

Usage of org.apache.flink.table.sinks.TableSink in project flink by apache.

From class HiveTableFactoryTest, method testGenericTable.

@Test
public void testGenericTable() throws Exception {
    final TableSchema schema = TableSchema.builder()
            .field("name", DataTypes.STRING())
            .field("age", DataTypes.INT())
            .build();
    catalog.createDatabase("mydb", new CatalogDatabaseImpl(new HashMap<>(), ""), true);
    // The COLLECTION connector option marks this as a generic (non-Hive) table.
    final Map<String, String> options = Collections.singletonMap(FactoryUtil.CONNECTOR.key(), "COLLECTION");
    final CatalogTable table = new CatalogTableImpl(schema, options, "csv table");
    catalog.createTable(new ObjectPath("mydb", "mytable"), table, true);
    final Optional<TableFactory> tableFactoryOpt = catalog.getTableFactory();
    assertTrue(tableFactoryOpt.isPresent());
    final HiveTableFactory tableFactory = (HiveTableFactory) tableFactoryOpt.get();
    // For a generic table, the Hive factory should fall back to regular stream source/sink implementations.
    final TableSource tableSource = tableFactory.createTableSource(
            new TableSourceFactoryContextImpl(
                    ObjectIdentifier.of("mycatalog", "mydb", "mytable"), table, new Configuration(), false));
    assertTrue(tableSource instanceof StreamTableSource);
    final TableSink tableSink = tableFactory.createTableSink(
            new TableSinkFactoryContextImpl(
                    ObjectIdentifier.of("mycatalog", "mydb", "mytable"), table, new Configuration(), true, false));
    assertTrue(tableSink instanceof StreamTableSink);
    assertTrue(tableSink instanceof StreamTableSink);
}
Also used: ObjectPath (org.apache.flink.table.catalog.ObjectPath), TableSchema (org.apache.flink.table.api.TableSchema), Configuration (org.apache.flink.configuration.Configuration), HashMap (java.util.HashMap), StreamTableSink (org.apache.flink.table.sinks.StreamTableSink), TableSink (org.apache.flink.table.sinks.TableSink), DynamicTableSink (org.apache.flink.table.connector.sink.DynamicTableSink), CatalogTable (org.apache.flink.table.catalog.CatalogTable), ResolvedCatalogTable (org.apache.flink.table.catalog.ResolvedCatalogTable), StreamTableSource (org.apache.flink.table.sources.StreamTableSource), CatalogDatabaseImpl (org.apache.flink.table.catalog.CatalogDatabaseImpl), TableSinkFactoryContextImpl (org.apache.flink.table.factories.TableSinkFactoryContextImpl), TableSource (org.apache.flink.table.sources.TableSource), DynamicTableSource (org.apache.flink.table.connector.source.DynamicTableSource), TableFactory (org.apache.flink.table.factories.TableFactory), CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl), TableSourceFactoryContextImpl (org.apache.flink.table.factories.TableSourceFactoryContextImpl), Test (org.junit.Test)
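A small aside that may help when reading the example above: the same table is addressed twice, once catalog-relative as an ObjectPath for createTable, and once fully qualified as an ObjectIdentifier for the factory contexts. The snippet below is an illustration added here, not part of the original test; it assumes ObjectIdentifier.toObjectPath() simply drops the catalog name.

import static org.junit.Assert.assertEquals;

import org.apache.flink.table.catalog.ObjectIdentifier;
import org.apache.flink.table.catalog.ObjectPath;

@Test
public void identifierVersusPathSketch() {
    // Fully qualified identifier, as used for the factory contexts above.
    ObjectIdentifier identifier = ObjectIdentifier.of("mycatalog", "mydb", "mytable");
    // Dropping the catalog name yields the catalog-relative path used for createTable.
    assertEquals(new ObjectPath("mydb", "mytable"), identifier.toObjectPath());
}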

Example 2 with TableSink

Usage of org.apache.flink.table.sinks.TableSink in project flink by apache.

From class TypeMappingUtilsTest, method testCheckPhysicalLogicalTypeCompatible.

@Test
public void testCheckPhysicalLogicalTypeCompatible() {
    TableSchema tableSchema = TableSchema.builder()
            .field("a", DataTypes.VARCHAR(2))
            .field("b", DataTypes.DECIMAL(20, 2))
            .build();
    TableSink tableSink = new TestTableSink(tableSchema);
    // The sink reports a legacy, TypeInformation-backed consumed type.
    LegacyTypeInformationType legacyDataType =
            (LegacyTypeInformationType) tableSink.getConsumedDataType().getLogicalType();
    TypeInformation legacyTypeInfo = ((TupleTypeInfo) legacyDataType.getTypeInformation()).getTypeAt(1);
    DataType physicalType = TypeConversions.fromLegacyInfoToDataType(legacyTypeInfo);
    ResolvedSchema physicSchema = DataTypeUtils.expandCompositeTypeToSchema(physicalType);
    DataType[] logicalDataTypes = tableSchema.getFieldDataTypes();
    List<DataType> physicalDataTypes = physicSchema.getColumnDataTypes();
    // Each physical field must be compatible with its declared logical type.
    for (int i = 0; i < logicalDataTypes.length; i++) {
        TypeMappingUtils.checkPhysicalLogicalTypeCompatible(
                physicalDataTypes.get(i).getLogicalType(),
                logicalDataTypes[i].getLogicalType(),
                "physicalField", "logicalField", false);
    }
}
Also used: TableSchema (org.apache.flink.table.api.TableSchema), DataType (org.apache.flink.table.types.DataType), TableSink (org.apache.flink.table.sinks.TableSink), ResolvedSchema (org.apache.flink.table.catalog.ResolvedSchema), LegacyTypeInformationType (org.apache.flink.table.types.logical.LegacyTypeInformationType), TypeInformation (org.apache.flink.api.common.typeinfo.TypeInformation), TupleTypeInfo (org.apache.flink.api.java.typeutils.TupleTypeInfo), Test (org.junit.Test)
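The TestTableSink used in this example is an inner helper of TypeMappingUtilsTest and is not reproduced on this page. The sketch below is a hypothetical stand-in, written only to show the one property the test relies on: getConsumedDataType() must report a legacy, TupleTypeInfo-backed type (a Tuple2<Boolean, Row> retract-style shape is assumed here) so that the casts to LegacyTypeInformationType and TupleTypeInfo succeed and getTypeAt(1) yields the row type. The real class may differ in detail.

import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.api.java.typeutils.TupleTypeInfo;
import org.apache.flink.table.api.TableSchema;
import org.apache.flink.table.sinks.TableSink;
import org.apache.flink.table.types.DataType;
import org.apache.flink.table.types.utils.TypeConversions;
import org.apache.flink.types.Row;

// Hypothetical stand-in for TestTableSink; not the actual Flink test class.
class TupleConsumingTestSink implements TableSink<Tuple2<Boolean, Row>> {

    private final TableSchema schema;

    TupleConsumingTestSink(TableSchema schema) {
        this.schema = schema;
    }

    @Override
    public TableSchema getTableSchema() {
        return schema;
    }

    @Override
    public DataType getConsumedDataType() {
        // A TupleTypeInfo has no exact counterpart in the new type system, so the
        // legacy converter wraps it in a LegacyTypeInformationType, which is what
        // the test unwraps and casts.
        return TypeConversions.fromLegacyInfoToDataType(
                new TupleTypeInfo<>(Types.BOOLEAN, schema.toRowType()));
    }

    @Override
    public TableSink<Tuple2<Boolean, Row>> configure(
            String[] fieldNames, TypeInformation<?>[] fieldTypes) {
        // Not needed for the type-compatibility check above.
        return this;
    }
}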

Example 3 with TableSink

Usage of org.apache.flink.table.sinks.TableSink in project flink by apache.

From class CsvTableSinkFactoryTest, method testAppendTableSinkFactory.

@Test
public void testAppendTableSinkFactory() {
    DescriptorProperties descriptor = createDescriptor(testingSchema);
    descriptor.putString("update-mode", "append");
    TableSink sink = createTableSink(descriptor);
    assertTrue(sink instanceof CsvTableSink);
    assertEquals(testingSchema.toRowDataType(), sink.getConsumedDataType());
}
Also used: CsvTableSink (org.apache.flink.table.sinks.CsvTableSink), DescriptorProperties (org.apache.flink.table.descriptors.DescriptorProperties), TableSink (org.apache.flink.table.sinks.TableSink), Test (org.junit.Test)

Example 4 with TableSink

Usage of org.apache.flink.table.sinks.TableSink in project flink by apache.

From class CsvTableSinkFactoryTest, method testBatchTableSinkFactory.

@Test
public void testBatchTableSinkFactory() {
    DescriptorProperties descriptor = createDescriptor(testingSchema);
    TableSink sink = createTableSink(descriptor);
    assertTrue(sink instanceof CsvTableSink);
    assertEquals(testingSchema.toRowDataType(), sink.getConsumedDataType());
}
Also used: CsvTableSink (org.apache.flink.table.sinks.CsvTableSink), DescriptorProperties (org.apache.flink.table.descriptors.DescriptorProperties), TableSink (org.apache.flink.table.sinks.TableSink), Test (org.junit.Test)
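Both CsvTableSinkFactoryTest examples above rely on two helpers, createDescriptor and createTableSink, that are not shown on this page. The sketch below is a hypothetical reconstruction, assuming the legacy filesystem/CSV descriptor keys and discovery through TableFactoryService; the actual helpers in Flink may set different properties.

import java.util.HashMap;
import java.util.Map;

import org.apache.flink.table.api.TableSchema;
import org.apache.flink.table.descriptors.DescriptorProperties;
import org.apache.flink.table.factories.TableFactoryService;
import org.apache.flink.table.factories.TableSinkFactory;
import org.apache.flink.table.sinks.TableSink;

// Hypothetical reconstruction of the helpers referenced in the two tests above.
private static DescriptorProperties createDescriptor(TableSchema schema) {
    Map<String, String> properties = new HashMap<>();
    // Legacy filesystem connector + CSV format keys (assumed).
    properties.put("connector.type", "filesystem");
    properties.put("connector.path", "/tmp/csv-sink-test");
    properties.put("format.type", "csv");

    DescriptorProperties descriptor = new DescriptorProperties(true);
    descriptor.putProperties(properties);
    // Register the table schema under the standard "schema" prefix.
    descriptor.putTableSchema("schema", schema);
    return descriptor;
}

private static TableSink<?> createTableSink(DescriptorProperties descriptor) {
    // Discover a matching legacy TableSinkFactory from the descriptor properties
    // and let it build the sink (a CsvTableSink in these tests).
    return TableFactoryService.find(TableSinkFactory.class, descriptor.asMap())
            .createTableSink(descriptor.asMap());
}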

Aggregations

TableSink (org.apache.flink.table.sinks.TableSink): 4 uses
Test (org.junit.Test): 4 uses
TableSchema (org.apache.flink.table.api.TableSchema): 2 uses
DescriptorProperties (org.apache.flink.table.descriptors.DescriptorProperties): 2 uses
CsvTableSink (org.apache.flink.table.sinks.CsvTableSink): 2 uses
HashMap (java.util.HashMap): 1 use
TypeInformation (org.apache.flink.api.common.typeinfo.TypeInformation): 1 use
TupleTypeInfo (org.apache.flink.api.java.typeutils.TupleTypeInfo): 1 use
Configuration (org.apache.flink.configuration.Configuration): 1 use
CatalogDatabaseImpl (org.apache.flink.table.catalog.CatalogDatabaseImpl): 1 use
CatalogTable (org.apache.flink.table.catalog.CatalogTable): 1 use
CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl): 1 use
ObjectPath (org.apache.flink.table.catalog.ObjectPath): 1 use
ResolvedCatalogTable (org.apache.flink.table.catalog.ResolvedCatalogTable): 1 use
ResolvedSchema (org.apache.flink.table.catalog.ResolvedSchema): 1 use
DynamicTableSink (org.apache.flink.table.connector.sink.DynamicTableSink): 1 use
DynamicTableSource (org.apache.flink.table.connector.source.DynamicTableSource): 1 use
TableFactory (org.apache.flink.table.factories.TableFactory): 1 use
TableSinkFactoryContextImpl (org.apache.flink.table.factories.TableSinkFactoryContextImpl): 1 use
TableSourceFactoryContextImpl (org.apache.flink.table.factories.TableSourceFactoryContextImpl): 1 use