
Example 51 with DynamicTableSource

Use of org.apache.flink.table.connector.source.DynamicTableSource in project flink by Apache.

From class FilterPushDownSpec, method apply:

public static SupportsFilterPushDown.Result apply(
        List<RexNode> predicates, DynamicTableSource tableSource, SourceAbilityContext context) {
    if (tableSource instanceof SupportsFilterPushDown) {
        // Converts Calcite RexNodes into Flink Expressions using the source row type,
        // the catalogs, and the session time zone.
        RexNodeToExpressionConverter converter = new RexNodeToExpressionConverter(
                new RexBuilder(FlinkTypeFactory.INSTANCE()),
                context.getSourceRowType().getFieldNames().toArray(new String[0]),
                context.getFunctionCatalog(),
                context.getCatalogManager(),
                TimeZone.getTimeZone(context.getTableConfig().getLocalTimeZone()));
        List<Expression> filters = predicates.stream().map(p -> {
            scala.Option<ResolvedExpression> expr = p.accept(converter);
            if (expr.isDefined()) {
                return expr.get();
            } else {
                throw new TableException(String.format(
                        "%s can not be converted to Expression, please make sure %s can accept %s.",
                        p.toString(), tableSource.getClass().getSimpleName(), p.toString()));
            }
        }).collect(Collectors.toList());
        // The resolver only needs to resolve the already converted expressions;
        // neither catalog lookups nor SQL expression parsing are expected at this point.
        ExpressionResolver resolver = ExpressionResolver.resolverFor(
                context.getTableConfig(),
                name -> Optional.empty(),
                context.getFunctionCatalog().asLookup(str -> {
                    throw new TableException(
                            "We should not need to lookup any expressions at this point");
                }),
                context.getCatalogManager().getDataTypeFactory(),
                (sqlExpression, inputRowType, outputType) -> {
                    throw new TableException(
                            "SQL expression parsing is not supported at this location.");
                }).build();
        // The source decides which of the resolved filters it accepts and which remain.
        return ((SupportsFilterPushDown) tableSource).applyFilters(resolver.resolve(filters));
    } else {
        throw new TableException(String.format(
                "%s does not support SupportsFilterPushDown.", tableSource.getClass().getName()));
    }
}
Also used : DynamicTableSource(org.apache.flink.table.connector.source.DynamicTableSource) FlinkRexUtil(org.apache.flink.table.planner.plan.utils.FlinkRexUtil) JsonCreator(org.apache.flink.shaded.jackson2.com.fasterxml.jackson.annotation.JsonCreator) RexBuilder(org.apache.calcite.rex.RexBuilder) TimeZone(java.util.TimeZone) TableException(org.apache.flink.table.api.TableException) Expression(org.apache.flink.table.expressions.Expression) FlinkTypeFactory(org.apache.flink.table.planner.calcite.FlinkTypeFactory) RowType(org.apache.flink.table.types.logical.RowType) SupportsFilterPushDown(org.apache.flink.table.connector.source.abilities.SupportsFilterPushDown) Collectors(java.util.stream.Collectors) JsonProperty(org.apache.flink.shaded.jackson2.com.fasterxml.jackson.annotation.JsonProperty) ArrayList(java.util.ArrayList) Objects(java.util.Objects) List(java.util.List) RexNode(org.apache.calcite.rex.RexNode) ResolvedExpression(org.apache.flink.table.expressions.ResolvedExpression) ExpressionResolver(org.apache.flink.table.expressions.resolver.ExpressionResolver) JavaScalaConversionUtil(org.apache.flink.table.planner.utils.JavaScalaConversionUtil) Optional(java.util.Optional) RexNodeToExpressionConverter(org.apache.flink.table.planner.plan.utils.RexNodeToExpressionConverter) Preconditions.checkNotNull(org.apache.flink.util.Preconditions.checkNotNull) JsonTypeName(org.apache.flink.shaded.jackson2.com.fasterxml.jackson.annotation.JsonTypeName)
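
For context, a connector opts into this optimization by implementing SupportsFilterPushDown on its source and splitting the resolved filters into accepted and remaining ones. The following is a minimal sketch, not code from the Flink sources above; MyFilterableSource and its isSupported() check are illustrative placeholders:

import java.util.ArrayList;
import java.util.List;

import org.apache.flink.table.connector.source.ScanTableSource;
import org.apache.flink.table.connector.source.abilities.SupportsFilterPushDown;
import org.apache.flink.table.expressions.ResolvedExpression;

/** Minimal sketch of a filterable source; the class and isSupported() are hypothetical. */
public abstract class MyFilterableSource implements ScanTableSource, SupportsFilterPushDown {

    private final List<ResolvedExpression> pushedFilters = new ArrayList<>();

    @Override
    public Result applyFilters(List<ResolvedExpression> filters) {
        List<ResolvedExpression> accepted = new ArrayList<>();
        List<ResolvedExpression> remaining = new ArrayList<>();
        for (ResolvedExpression filter : filters) {
            // isSupported() stands in for whatever the external system can evaluate natively.
            if (isSupported(filter)) {
                accepted.add(filter);
            } else {
                remaining.add(filter);
            }
        }
        pushedFilters.addAll(accepted);
        // Filters reported as remaining are still evaluated by the Flink runtime after the scan.
        return Result.of(accepted, remaining);
    }

    // Decides whether a single resolved filter can be pushed to the external system.
    protected abstract boolean isSupported(ResolvedExpression filter);
}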

Example 52 with DynamicTableSource

Use of org.apache.flink.table.connector.source.DynamicTableSource in project flink by Apache.

From class FactoryUtilTest, method testDiscoveryForSeparateSourceSinkFactory:

@Test
public void testDiscoveryForSeparateSourceSinkFactory() {
    final Map<String, String> options = createAllOptions();
    // the "test" source and sink factory is not in one factory class
    // see TestDynamicTableSinkFactory and TestDynamicTableSourceFactory
    options.put("connector", "test");
    final DynamicTableSource actualSource = createTableSource(SCHEMA, options);
    final DynamicTableSource expectedSource = new DynamicTableSourceMock("MyTarget", null, new DecodingFormatMock(",", false), new DecodingFormatMock("|", true));
    assertThat(actualSource).isEqualTo(expectedSource);
    final DynamicTableSink actualSink = createTableSink(SCHEMA, options);
    final DynamicTableSink expectedSink = new DynamicTableSinkMock("MyTarget", 1000L, new EncodingFormatMock(","), new EncodingFormatMock("|"));
    assertThat(actualSink).isEqualTo(expectedSink);
}
Also used : EncodingFormatMock(org.apache.flink.table.factories.TestFormatFactory.EncodingFormatMock) DecodingFormatMock(org.apache.flink.table.factories.TestFormatFactory.DecodingFormatMock) DynamicTableSink(org.apache.flink.table.connector.sink.DynamicTableSink) DynamicTableSourceMock(org.apache.flink.table.factories.TestDynamicTableFactory.DynamicTableSourceMock) DynamicTableSource(org.apache.flink.table.connector.source.DynamicTableSource) DynamicTableSinkMock(org.apache.flink.table.factories.TestDynamicTableFactory.DynamicTableSinkMock) Test(org.junit.jupiter.api.Test)
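
Factory discovery itself works by matching the 'connector' option against Factory#factoryIdentifier() of classes registered via Java SPI in META-INF/services/org.apache.flink.table.factories.Factory. A minimal source-only factory could look like the sketch below; the "my-connector" identifier, the hostname option, and the omitted source construction are illustrative assumptions, not part of the test above:

import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

import org.apache.flink.configuration.ConfigOption;
import org.apache.flink.configuration.ConfigOptions;
import org.apache.flink.table.connector.source.DynamicTableSource;
import org.apache.flink.table.factories.DynamicTableSourceFactory;
import org.apache.flink.table.factories.FactoryUtil;

/** Sketch only: "my-connector" and the hostname option are made-up examples. */
public class MyDynamicTableSourceFactory implements DynamicTableSourceFactory {

    public static final ConfigOption<String> HOSTNAME =
            ConfigOptions.key("hostname").stringType().noDefaultValue();

    @Override
    public String factoryIdentifier() {
        // Matched against the 'connector' option, just as "test" is in the test above.
        return "my-connector";
    }

    @Override
    public Set<ConfigOption<?>> requiredOptions() {
        Set<ConfigOption<?>> options = new HashSet<>();
        options.add(HOSTNAME);
        return options;
    }

    @Override
    public Set<ConfigOption<?>> optionalOptions() {
        return Collections.emptySet();
    }

    @Override
    public DynamicTableSource createDynamicTableSource(Context context) {
        FactoryUtil.TableFactoryHelper helper = FactoryUtil.createTableFactoryHelper(this, context);
        // Fails if unknown options are present or required options are missing.
        helper.validate();
        String hostname = helper.getOptions().get(HOSTNAME);
        // A real factory would build and return its DynamicTableSource from the options here.
        throw new UnsupportedOperationException("Sketch only; source construction omitted for " + hostname);
    }
}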

Example 53 with DynamicTableSource

Use of org.apache.flink.table.connector.source.DynamicTableSource in project flink by Apache.

From class FactoryUtilTest, method testAlternativeValueFormat:

@Test
public void testAlternativeValueFormat() {
    final Map<String, String> options = createAllOptions();
    options.remove("value.format");
    options.remove("value.test-format.delimiter");
    options.remove("value.test-format.fail-on-missing");
    options.put("format", "test-format");
    options.put("test-format.delimiter", ";");
    options.put("test-format.fail-on-missing", "true");
    final DynamicTableSource actualSource = createTableSource(SCHEMA, options);
    final DynamicTableSource expectedSource = new DynamicTableSourceMock("MyTarget", null, new DecodingFormatMock(",", false), new DecodingFormatMock(";", true));
    assertThat(actualSource).isEqualTo(expectedSource);
    final DynamicTableSink actualSink = createTableSink(SCHEMA, options);
    final DynamicTableSink expectedSink = new DynamicTableSinkMock("MyTarget", 1000L, new EncodingFormatMock(","), new EncodingFormatMock(";"));
    assertThat(actualSink).isEqualTo(expectedSink);
}
Also used : EncodingFormatMock(org.apache.flink.table.factories.TestFormatFactory.EncodingFormatMock) DecodingFormatMock(org.apache.flink.table.factories.TestFormatFactory.DecodingFormatMock) DynamicTableSink(org.apache.flink.table.connector.sink.DynamicTableSink) DynamicTableSourceMock(org.apache.flink.table.factories.TestDynamicTableFactory.DynamicTableSourceMock) DynamicTableSource(org.apache.flink.table.connector.source.DynamicTableSource) DynamicTableSinkMock(org.apache.flink.table.factories.TestDynamicTableFactory.DynamicTableSinkMock) Test(org.junit.jupiter.api.Test)
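
The test passes because the factory accepts the generic 'format' key as an alternative to the prefixed 'value.format' key. One way such a fallback can be implemented is sketched below; this is an assumption about the mechanism, not the actual helper in TestDynamicTableFactory, and the VALUE_FORMAT option here is illustrative:

import java.util.Optional;

import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.configuration.ConfigOption;
import org.apache.flink.configuration.ConfigOptions;
import org.apache.flink.table.connector.format.DecodingFormat;
import org.apache.flink.table.data.RowData;
import org.apache.flink.table.factories.DeserializationFormatFactory;
import org.apache.flink.table.factories.FactoryUtil;

/** Sketch of format discovery with a fallback key; the option names are illustrative. */
public final class FormatDiscoverySketch {

    // Mirrors the prefixed key that the test removes in favor of the plain 'format' key.
    static final ConfigOption<String> VALUE_FORMAT =
            ConfigOptions.key("value.format").stringType().noDefaultValue();

    /** Prefers the generic 'format' key and falls back to the prefixed 'value.format' key. */
    static Optional<DecodingFormat<DeserializationSchema<RowData>>> discoverValueFormat(
            FactoryUtil.TableFactoryHelper helper) {
        Optional<DecodingFormat<DeserializationSchema<RowData>>> format =
                helper.discoverOptionalDecodingFormat(
                        DeserializationFormatFactory.class, FactoryUtil.FORMAT);
        if (format.isPresent()) {
            return format;
        }
        return helper.discoverOptionalDecodingFormat(
                DeserializationFormatFactory.class, VALUE_FORMAT);
    }

    private FormatDiscoverySketch() {}
}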

Example 54 with DynamicTableSource

Use of org.apache.flink.table.connector.source.DynamicTableSource in project flink by Apache.

From class FactoryUtilTest, method testOptionalFormat:

@Test
public void testOptionalFormat() {
    final Map<String, String> options = createAllOptions();
    options.remove("key.format");
    options.remove("key.test-format.delimiter");
    final DynamicTableSource actualSource = createTableSource(SCHEMA, options);
    final DynamicTableSource expectedSource = new DynamicTableSourceMock("MyTarget", null, null, new DecodingFormatMock("|", true));
    assertThat(actualSource).isEqualTo(expectedSource);
    final DynamicTableSink actualSink = createTableSink(SCHEMA, options);
    final DynamicTableSink expectedSink = new DynamicTableSinkMock("MyTarget", 1000L, null, new EncodingFormatMock("|"));
    assertThat(actualSink).isEqualTo(expectedSink);
}
Also used : EncodingFormatMock(org.apache.flink.table.factories.TestFormatFactory.EncodingFormatMock) DecodingFormatMock(org.apache.flink.table.factories.TestFormatFactory.DecodingFormatMock) DynamicTableSink(org.apache.flink.table.connector.sink.DynamicTableSink) DynamicTableSourceMock(org.apache.flink.table.factories.TestDynamicTableFactory.DynamicTableSourceMock) DynamicTableSource(org.apache.flink.table.connector.source.DynamicTableSource) DynamicTableSinkMock(org.apache.flink.table.factories.TestDynamicTableFactory.DynamicTableSinkMock) Test(org.junit.jupiter.api.Test)
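
Here the key format options are simply absent, so the factory must treat the key format as optional. A rough sketch of the difference between required and optional format discovery follows; KEY_FORMAT and VALUE_FORMAT are assumed option constants and not necessarily those used by TestDynamicTableFactory:

import java.util.Optional;

import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.configuration.ConfigOption;
import org.apache.flink.configuration.ConfigOptions;
import org.apache.flink.table.connector.format.DecodingFormat;
import org.apache.flink.table.data.RowData;
import org.apache.flink.table.factories.DeserializationFormatFactory;
import org.apache.flink.table.factories.FactoryUtil;

/** Sketch contrasting required and optional format discovery; option names are illustrative. */
public final class OptionalFormatSketch {

    static final ConfigOption<String> KEY_FORMAT =
            ConfigOptions.key("key.format").stringType().noDefaultValue();
    static final ConfigOption<String> VALUE_FORMAT =
            ConfigOptions.key("value.format").stringType().noDefaultValue();

    static void discoverFormats(FactoryUtil.TableFactoryHelper helper) {
        // A required format: discovery fails if the option is missing.
        DecodingFormat<DeserializationSchema<RowData>> valueFormat =
                helper.discoverDecodingFormat(DeserializationFormatFactory.class, VALUE_FORMAT);
        // An optional format: an absent option yields an empty Optional, matching the test above
        // where the expected source carries null for the key format.
        Optional<DecodingFormat<DeserializationSchema<RowData>>> keyFormat =
                helper.discoverOptionalDecodingFormat(DeserializationFormatFactory.class, KEY_FORMAT);
        DecodingFormat<DeserializationSchema<RowData>> keyFormatOrNull = keyFormat.orElse(null);
    }

    private OptionalFormatSketch() {}
}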

Example 55 with DynamicTableSource

Use of org.apache.flink.table.connector.source.DynamicTableSource in project flink by Apache.

From class DynamicSourceUtils, method convertDataStreamToRel:

/**
 * Converts a given {@link DataStream} to a {@link RelNode}. It adds helper projections if
 * necessary.
 */
public static RelNode convertDataStreamToRel(boolean isBatchMode, ReadableConfig config, FlinkRelBuilder relBuilder, ContextResolvedTable contextResolvedTable, DataStream<?> dataStream, DataType physicalDataType, boolean isTopLevelRecord, ChangelogMode changelogMode) {
    final DynamicTableSource tableSource = new ExternalDynamicSource<>(contextResolvedTable.getIdentifier(), dataStream, physicalDataType, isTopLevelRecord, changelogMode);
    final FlinkStatistic statistic = FlinkStatistic.unknown(contextResolvedTable.getResolvedSchema()).build();
    return convertSourceToRel(isBatchMode, config, relBuilder, contextResolvedTable, statistic, Collections.emptyList(), tableSource);
}
Also used : FlinkStatistic(org.apache.flink.table.planner.plan.stats.FlinkStatistic) DynamicTableSource(org.apache.flink.table.connector.source.DynamicTableSource)
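
This utility is reached when a DataStream is registered with the Table API, for example via StreamTableEnvironment.fromDataStream on the Java bridge API. A minimal standalone sketch of that user-facing entry point (not taken from the examples above):

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public final class DataStreamToTableSketch {

    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        DataStream<String> words = env.fromElements("hello", "world");

        // The DataStream is wrapped as an external source; during planning it is turned into
        // a RelNode along the lines of DynamicSourceUtils.convertDataStreamToRel above.
        Table table = tableEnv.fromDataStream(words);
        table.printSchema();
    }
}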

Aggregations

DynamicTableSource (org.apache.flink.table.connector.source.DynamicTableSource): 55
Test (org.junit.Test): 24
DynamicTableSink (org.apache.flink.table.connector.sink.DynamicTableSink): 12
TestDynamicTableFactory (org.apache.flink.table.factories.TestDynamicTableFactory): 12
Test (org.junit.jupiter.api.Test): 10
RowData (org.apache.flink.table.data.RowData): 9
DecodingFormatMock (org.apache.flink.table.factories.TestFormatFactory.DecodingFormatMock): 8
TableSourceTable (org.apache.flink.table.planner.plan.schema.TableSourceTable): 8
ResolvedSchema (org.apache.flink.table.catalog.ResolvedSchema): 7
HashMap (java.util.HashMap): 5
Configuration (org.apache.flink.configuration.Configuration): 5
ScanTableSource (org.apache.flink.table.connector.source.ScanTableSource): 5
ParameterizedTest (org.junit.jupiter.params.ParameterizedTest): 5
ArrayList (java.util.ArrayList): 4
JdbcConnectorOptions (org.apache.flink.connector.jdbc.internal.options.JdbcConnectorOptions): 4
JdbcLookupOptions (org.apache.flink.connector.jdbc.internal.options.JdbcLookupOptions): 4
CatalogTable (org.apache.flink.table.catalog.CatalogTable): 4
SourceAbilitySpec (org.apache.flink.table.planner.plan.abilities.source.SourceAbilitySpec): 4
List (java.util.List): 3
LogicalTableScan (org.apache.calcite.rel.logical.LogicalTableScan): 3