
Example 1 with SourceAbilityContext

Use of org.apache.flink.table.planner.plan.abilities.source.SourceAbilityContext in the apache/flink project.

From the class DynamicTableSourceSpec, method getTableSource:

private DynamicTableSource getTableSource(FlinkContext flinkContext) {
    if (tableSource == null) {
        final DynamicTableSourceFactory factory =
                flinkContext
                        .getModuleManager()
                        .getFactory(Module::getTableSourceFactory)
                        .orElse(null);
        tableSource =
                FactoryUtil.createDynamicTableSource(
                        factory,
                        contextResolvedTable.getIdentifier(),
                        contextResolvedTable.getResolvedTable(),
                        loadOptionsFromCatalogTable(contextResolvedTable, flinkContext),
                        flinkContext.getTableConfig().getConfiguration(),
                        flinkContext.getClassLoader(),
                        contextResolvedTable.isTemporary());
        if (sourceAbilities != null) {
            RowType newProducedType =
                    (RowType)
                            contextResolvedTable
                                    .getResolvedSchema()
                                    .toSourceRowDataType()
                                    .getLogicalType();
            for (SourceAbilitySpec spec : sourceAbilities) {
                SourceAbilityContext context = new SourceAbilityContext(flinkContext, newProducedType);
                spec.apply(tableSource, context);
                if (spec.getProducedType().isPresent()) {
                    newProducedType = spec.getProducedType().get();
                }
            }
        }
    }
    return tableSource;
}
Also used:
- DynamicTableSourceFactory (org.apache.flink.table.factories.DynamicTableSourceFactory)
- SourceAbilityContext (org.apache.flink.table.planner.plan.abilities.source.SourceAbilityContext)
- SourceAbilitySpec (org.apache.flink.table.planner.plan.abilities.source.SourceAbilitySpec)
- RowType (org.apache.flink.table.types.logical.RowType)
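The loop in getTableSource threads the produced row type through the chain of ability specs: each spec is applied against the type left by its predecessor, and only specs that report a new produced type advance it. A minimal self-contained sketch of that pattern, using a toy Spec interface standing in for Flink's SourceAbilitySpec (all names here are illustrative, not Flink API, and the row type is modeled as a plain String):

```java
import java.util.List;
import java.util.Optional;

public class SpecChain {

    // Hypothetical stand-in for Flink's SourceAbilitySpec: applying a spec may
    // (or may not) replace the currently produced type.
    interface Spec {
        Optional<String> producedType(String currentType);
    }

    // Mirrors the loop in getTableSource: each spec sees the type left behind
    // by its predecessor; specs without a new produced type leave it unchanged.
    public static String applyAll(String initialType, List<Spec> specs) {
        String type = initialType;
        for (Spec spec : specs) {
            type = spec.producedType(type).orElse(type);
        }
        return type;
    }
}
```

The ordering matters for the same reason it does in the real code: a projection spec that shrinks the row must be applied before any spec that reads the new, narrower type.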

Example 2 with SourceAbilityContext

Use of org.apache.flink.table.planner.plan.abilities.source.SourceAbilityContext in the apache/flink project.

From the class PushWatermarkIntoTableSourceScanRuleBase, method getNewScan:

/**
 * Uses the input watermark expression to generate the {@link WatermarkGeneratorSupplier} and
 * builds a new scan once the {@link WatermarkStrategy} has been pushed into it. When the
 * {@link FlinkLogicalWatermarkAssigner} is the parent of the {@link
 * FlinkLogicalTableSourceScan}, the rowtime type must be modified so that the types in the
 * plan stay consistent. In all other cases, the scan keeps its previous data type and the
 * adjustment is deferred to the projection rewrite.
 *
 * <p>NOTE: the row type of the scan is not always the same as that of the watermark assigner,
 * because pushing the watermark assigner into the scan does not add the rowtime column to the
 * scan's row. A query may also define computed columns on the rowtime column; changing the
 * rowtime type (to a time attribute) would then change the type of those computed columns as
 * well. Therefore, if the watermark assigner is not the parent of the scan, the scan keeps
 * its previous type and the work is left to the projection.
 */
protected FlinkLogicalTableSourceScan getNewScan(
        FlinkLogicalWatermarkAssigner watermarkAssigner,
        RexNode watermarkExpr,
        FlinkLogicalTableSourceScan scan,
        TableConfig tableConfig,
        boolean useWatermarkAssignerRowType) {
    final TableSourceTable tableSourceTable = scan.getTable().unwrap(TableSourceTable.class);
    final DynamicTableSource newDynamicTableSource = tableSourceTable.tableSource().copy();
    final boolean isSourceWatermark =
            newDynamicTableSource instanceof SupportsSourceWatermark
                    && hasSourceWatermarkDeclaration(watermarkExpr);
    final RelDataType newType;
    if (useWatermarkAssignerRowType) {
        // project is trivial and set rowtime type in scan
        newType = watermarkAssigner.getRowType();
    } else {
        // project add/delete columns and set the rowtime column type in project
        newType = scan.getRowType();
    }
    final RowType producedType = (RowType) FlinkTypeFactory.toLogicalType(newType);
    final SourceAbilityContext abilityContext = SourceAbilityContext.from(scan);
    final SourceAbilitySpec abilitySpec;
    if (isSourceWatermark) {
        final SourceWatermarkSpec sourceWatermarkSpec = new SourceWatermarkSpec(true, producedType);
        sourceWatermarkSpec.apply(newDynamicTableSource, abilityContext);
        abilitySpec = sourceWatermarkSpec;
    } else {
        final Duration idleTimeout =
                tableConfig
                        .getConfiguration()
                        .get(ExecutionConfigOptions.TABLE_EXEC_SOURCE_IDLE_TIMEOUT);
        final long idleTimeoutMillis;
        if (!idleTimeout.isZero() && !idleTimeout.isNegative()) {
            idleTimeoutMillis = idleTimeout.toMillis();
        } else {
            idleTimeoutMillis = -1L;
        }
        final WatermarkPushDownSpec watermarkPushDownSpec =
                new WatermarkPushDownSpec(watermarkExpr, idleTimeoutMillis, producedType);
        watermarkPushDownSpec.apply(newDynamicTableSource, abilityContext);
        abilitySpec = watermarkPushDownSpec;
    }
    TableSourceTable newTableSourceTable =
            tableSourceTable.copy(
                    newDynamicTableSource, newType, new SourceAbilitySpec[] {abilitySpec});
    return FlinkLogicalTableSourceScan.create(
            scan.getCluster(), scan.getHints(), newTableSourceTable);
}
Also used:
- WatermarkPushDownSpec (org.apache.flink.table.planner.plan.abilities.source.WatermarkPushDownSpec)
- SourceAbilityContext (org.apache.flink.table.planner.plan.abilities.source.SourceAbilityContext)
- SourceAbilitySpec (org.apache.flink.table.planner.plan.abilities.source.SourceAbilitySpec)
- SupportsSourceWatermark (org.apache.flink.table.connector.source.abilities.SupportsSourceWatermark)
- RowType (org.apache.flink.table.types.logical.RowType)
- RelDataType (org.apache.calcite.rel.type.RelDataType)
- Duration (java.time.Duration)
- TableSourceTable (org.apache.flink.table.planner.plan.schema.TableSourceTable)
- DynamicTableSource (org.apache.flink.table.connector.source.DynamicTableSource)
- SourceWatermarkSpec (org.apache.flink.table.planner.plan.abilities.source.SourceWatermarkSpec)
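The idle-timeout branch in getNewScan encodes the configured Duration for the push-down spec: a strictly positive timeout is converted to milliseconds, while zero or negative values disable idleness detection and are encoded as the sentinel -1. A self-contained sketch of that conversion (the helper name toMillis is illustrative, not Flink API):

```java
import java.time.Duration;

public class IdleTimeoutEncoding {

    // Mirrors the branch in getNewScan: a strictly positive idle timeout is
    // pushed into the source in milliseconds; zero or negative durations mean
    // "idleness detection disabled", encoded as the sentinel -1.
    public static long toMillis(Duration idleTimeout) {
        if (!idleTimeout.isZero() && !idleTimeout.isNegative()) {
            return idleTimeout.toMillis();
        }
        return -1L;
    }
}
```

Using a sentinel rather than Optional keeps the value directly serializable into the WatermarkPushDownSpec constructor, which takes a primitive long.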

Example 3 with SourceAbilityContext

Use of org.apache.flink.table.planner.plan.abilities.source.SourceAbilityContext in the apache/flink project.

From the class PushProjectIntoTableSourceScanRule, method onMatch:

@Override
public void onMatch(RelOptRuleCall call) {
    final LogicalProject project = call.rel(0);
    final LogicalTableScan scan = call.rel(1);
    final TableSourceTable sourceTable = scan.getTable().unwrap(TableSourceTable.class);
    final boolean supportsNestedProjection = supportsNestedProjection(sourceTable.tableSource());
    final int[] refFields = RexNodeExtractor.extractRefInputFields(project.getProjects());
    if (!supportsNestedProjection && refFields.length == scan.getRowType().getFieldCount()) {
        // There is no top-level projection and nested projections aren't supported.
        return;
    }
    final FlinkTypeFactory typeFactory = unwrapTypeFactory(scan);
    final ResolvedSchema schema = sourceTable.contextResolvedTable().getResolvedSchema();
    final RowType producedType = createProducedType(schema, sourceTable.tableSource());
    final NestedSchema projectedSchema =
            NestedProjectionUtil.build(
                    getProjections(project, scan),
                    typeFactory.buildRelNodeRowType(producedType));
    if (!supportsNestedProjection) {
        for (NestedColumn column : projectedSchema.columns().values()) {
            column.markLeaf();
        }
    }
    final List<SourceAbilitySpec> abilitySpecs = new ArrayList<>();
    final RowType newProducedType = performPushDown(sourceTable, projectedSchema, producedType, abilitySpecs);
    final DynamicTableSource newTableSource = sourceTable.tableSource().copy();
    final SourceAbilityContext context = SourceAbilityContext.from(scan);
    abilitySpecs.forEach(spec -> spec.apply(newTableSource, context));
    final RelDataType newRowType = typeFactory.buildRelNodeRowType(newProducedType);
    final TableSourceTable newSource =
            sourceTable.copy(
                    newTableSource, newRowType, abilitySpecs.toArray(new SourceAbilitySpec[0]));
    final LogicalTableScan newScan =
            new LogicalTableScan(
                    scan.getCluster(), scan.getTraitSet(), scan.getHints(), newSource);
    final LogicalProject newProject =
            project.copy(
                    project.getTraitSet(),
                    newScan,
                    rewriteProjections(call, newSource, projectedSchema),
                    project.getRowType());
    if (ProjectRemoveRule.isTrivial(newProject)) {
        call.transformTo(newScan);
    } else {
        call.transformTo(newProject);
    }
}
Also used:
- SourceAbilitySpec (org.apache.flink.table.planner.plan.abilities.source.SourceAbilitySpec)
- ArrayList (java.util.ArrayList)
- RowType (org.apache.flink.table.types.logical.RowType)
- NestedColumn (org.apache.flink.table.planner.plan.utils.NestedColumn)
- RelDataType (org.apache.calcite.rel.type.RelDataType)
- LogicalTableScan (org.apache.calcite.rel.logical.LogicalTableScan)
- FlinkTypeFactory (org.apache.flink.table.planner.calcite.FlinkTypeFactory)
- SourceAbilityContext (org.apache.flink.table.planner.plan.abilities.source.SourceAbilityContext)
- LogicalProject (org.apache.calcite.rel.logical.LogicalProject)
- TableSourceTable (org.apache.flink.table.planner.plan.schema.TableSourceTable)
- ResolvedSchema (org.apache.flink.table.catalog.ResolvedSchema)
- DynamicTableSource (org.apache.flink.table.connector.source.DynamicTableSource)
- NestedSchema (org.apache.flink.table.planner.plan.utils.NestedSchema)
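The early exit at the top of onMatch is the interesting guard: when the source cannot prune nested fields, pushing the projection down only pays off if it references fewer distinct top-level fields than the scan produces; otherwise the rule bails out. A self-contained sketch of that decision (method and parameter names are illustrative, not Flink API; the referenced fields are modeled as a set of distinct indices, as RexNodeExtractor.extractRefInputFields effectively yields):

```java
import java.util.Set;

public class ProjectionPushDownCheck {

    // Mirrors the early exit in onMatch: without nested-projection support,
    // push-down only helps when the projection references fewer distinct
    // top-level fields than the scan currently produces.
    public static boolean shouldPushDown(
            Set<Integer> referencedFieldIndices,
            int scanFieldCount,
            boolean supportsNestedProjection) {
        return supportsNestedProjection
                || referencedFieldIndices.size() < scanFieldCount;
    }
}
```

This matches the rule's behavior of still firing for nested-projection-capable sources even when every top-level field is referenced, since such a source may still prune fields inside a struct column.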

Aggregations

- SourceAbilityContext (org.apache.flink.table.planner.plan.abilities.source.SourceAbilityContext): 3 usages
- SourceAbilitySpec (org.apache.flink.table.planner.plan.abilities.source.SourceAbilitySpec): 3 usages
- RowType (org.apache.flink.table.types.logical.RowType): 3 usages
- RelDataType (org.apache.calcite.rel.type.RelDataType): 2 usages
- DynamicTableSource (org.apache.flink.table.connector.source.DynamicTableSource): 2 usages
- TableSourceTable (org.apache.flink.table.planner.plan.schema.TableSourceTable): 2 usages
- Duration (java.time.Duration): 1 usage
- ArrayList (java.util.ArrayList): 1 usage
- LogicalProject (org.apache.calcite.rel.logical.LogicalProject): 1 usage
- LogicalTableScan (org.apache.calcite.rel.logical.LogicalTableScan): 1 usage
- ResolvedSchema (org.apache.flink.table.catalog.ResolvedSchema): 1 usage
- SupportsSourceWatermark (org.apache.flink.table.connector.source.abilities.SupportsSourceWatermark): 1 usage
- DynamicTableSourceFactory (org.apache.flink.table.factories.DynamicTableSourceFactory): 1 usage
- FlinkTypeFactory (org.apache.flink.table.planner.calcite.FlinkTypeFactory): 1 usage
- SourceWatermarkSpec (org.apache.flink.table.planner.plan.abilities.source.SourceWatermarkSpec): 1 usage
- WatermarkPushDownSpec (org.apache.flink.table.planner.plan.abilities.source.WatermarkPushDownSpec): 1 usage
- NestedColumn (org.apache.flink.table.planner.plan.utils.NestedColumn): 1 usage
- NestedSchema (org.apache.flink.table.planner.plan.utils.NestedSchema): 1 usage