Search in sources:

Example 1 with OperationTreeBuilder

Use of org.apache.flink.table.operations.utils.OperationTreeBuilder in project flink by apache.

From the class AbstractStreamTableEnvironmentImpl, method toStreamInternal:

protected <T> DataStream<T> toStreamInternal(
        Table table, SchemaTranslator.ProducingResult schemaTranslationResult, @Nullable ChangelogMode changelogMode) {
    final CatalogManager catalogManager = getCatalogManager();
    final OperationTreeBuilder operationTreeBuilder = getOperationTreeBuilder();
    // Apply the projections produced during schema translation, if any; otherwise reuse the table's query operation.
    final QueryOperation projectOperation = schemaTranslationResult.getProjections()
            .map(projections -> operationTreeBuilder.project(
                    projections.stream().map(ApiExpressionUtils::unresolvedRef).collect(Collectors.toList()),
                    table.getQueryOperation()))
            .orElseGet(table::getQueryOperation);
    // Back the DataStream sink with an anonymous catalog table built from the translated schema.
    final ResolvedCatalogTable resolvedCatalogTable = catalogManager.resolveCatalogTable(
            new ExternalCatalogTable(schemaTranslationResult.getSchema()));
    final ExternalModifyOperation modifyOperation = new ExternalModifyOperation(
            ContextResolvedTable.anonymous("datastream_sink", resolvedCatalogTable),
            projectOperation,
            changelogMode,
            schemaTranslationResult.getPhysicalDataType()
                    .orElseGet(() -> resolvedCatalogTable.getResolvedSchema().toPhysicalRowDataType()));
    return toStreamInternal(table, modifyOperation);
}
Also used: DataType(org.apache.flink.table.types.DataType) CatalogManager(org.apache.flink.table.catalog.CatalogManager) ModifyOperation(org.apache.flink.table.operations.ModifyOperation) QueryOperation(org.apache.flink.table.operations.QueryOperation) Schema(org.apache.flink.table.api.Schema) ObjectIdentifier(org.apache.flink.table.catalog.ObjectIdentifier) DataStreamQueryOperation(org.apache.flink.table.operations.DataStreamQueryOperation) Tuple2(org.apache.flink.api.java.tuple.Tuple2) UnresolvedIdentifier(org.apache.flink.table.catalog.UnresolvedIdentifier) ChangelogMode(org.apache.flink.table.connector.ChangelogMode) ExecutorFactory(org.apache.flink.table.delegation.ExecutorFactory) TupleTypeInfo(org.apache.flink.api.java.typeutils.TupleTypeInfo) Types(org.apache.flink.table.api.Types) FunctionCatalog(org.apache.flink.table.catalog.FunctionCatalog) Planner(org.apache.flink.table.delegation.Planner) ExternalQueryOperation(org.apache.flink.table.operations.ExternalQueryOperation) Expression(org.apache.flink.table.expressions.Expression) TableEnvironmentImpl(org.apache.flink.table.api.internal.TableEnvironmentImpl) OperationTreeBuilder(org.apache.flink.table.operations.utils.OperationTreeBuilder) TypeInformation(org.apache.flink.api.common.typeinfo.TypeInformation) ExternalModifyOperation(org.apache.flink.table.operations.ExternalModifyOperation) ResolvedCatalogTable(org.apache.flink.table.catalog.ResolvedCatalogTable) Nullable(javax.annotation.Nullable) ContextResolvedTable(org.apache.flink.table.catalog.ContextResolvedTable) TimeCharacteristic(org.apache.flink.streaming.api.TimeCharacteristic) SchemaTranslator(org.apache.flink.table.catalog.SchemaTranslator) ModuleManager(org.apache.flink.table.module.ModuleManager) TableConfig(org.apache.flink.table.api.TableConfig) TableException(org.apache.flink.table.api.TableException) Table(org.apache.flink.table.api.Table) StreamExecutorFactory(org.apache.flink.table.delegation.StreamExecutorFactory) Preconditions(org.apache.flink.util.Preconditions) Collectors(java.util.stream.Collectors) FieldInfoUtils(org.apache.flink.table.typeutils.FieldInfoUtils) DataStream(org.apache.flink.streaming.api.datastream.DataStream) ExternalCatalogTable(org.apache.flink.table.catalog.ExternalCatalogTable) List(java.util.List) TypeExtractor(org.apache.flink.api.java.typeutils.TypeExtractor) FactoryUtil(org.apache.flink.table.factories.FactoryUtil) ValidationException(org.apache.flink.table.api.ValidationException) Executor(org.apache.flink.table.delegation.Executor) ApiExpressionUtils(org.apache.flink.table.expressions.ApiExpressionUtils) Optional(java.util.Optional) Internal(org.apache.flink.annotation.Internal) TypeConversions(org.apache.flink.table.types.utils.TypeConversions) Transformation(org.apache.flink.api.dag.Transformation) Collections(java.util.Collections) StreamExecutionEnvironment(org.apache.flink.streaming.api.environment.StreamExecutionEnvironment)
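
In user code this path is normally reached through the public bridge API rather than by calling toStreamInternal directly. The following is a minimal, hypothetical caller sketch (the class name and sample rows are made up for illustration, and it is not part of the Flink sources shown here); it assumes that StreamTableEnvironment.toDataStream eventually runs through the method above.

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

import static org.apache.flink.table.api.Expressions.row;

public class ToDataStreamSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);
        // A small in-memory table; in practice this would come from a catalog or a query.
        Table table = tableEnv.fromValues(row(1, "Alice"), row(2, "Bob"));
        // toDataStream() goes through the schema translation and the ExternalModifyOperation
        // construction shown in toStreamInternal() above.
        DataStream<Row> stream = tableEnv.toDataStream(table);
        stream.print();
        env.execute();
    }
}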

Example 2 with OperationTreeBuilder

Use of org.apache.flink.table.operations.utils.OperationTreeBuilder in project flink by apache.

From the class AbstractStreamTableEnvironmentImpl, method fromStreamInternal:

protected <T> Table fromStreamInternal(
        DataStream<T> dataStream, @Nullable Schema schema, @Nullable String viewPath, ChangelogMode changelogMode) {
    Preconditions.checkNotNull(dataStream, "Data stream must not be null.");
    Preconditions.checkNotNull(changelogMode, "Changelog mode must not be null.");
    if (dataStream.getExecutionEnvironment() != executionEnvironment) {
        throw new ValidationException(
                "The DataStream's StreamExecutionEnvironment must be identical to the one that "
                        + "has been passed to the StreamTableEnvironment during instantiation.");
    }
    final CatalogManager catalogManager = getCatalogManager();
    final OperationTreeBuilder operationTreeBuilder = getOperationTreeBuilder();
    // Derive the table schema (and possible projections) from the stream's type and the optional user schema.
    final SchemaTranslator.ConsumingResult schemaTranslationResult = SchemaTranslator.createConsumingResult(
            catalogManager.getDataTypeFactory(), dataStream.getType(), schema);
    final ResolvedCatalogTable resolvedCatalogTable = catalogManager.resolveCatalogTable(
            new ExternalCatalogTable(schemaTranslationResult.getSchema()));
    // Resolve the backing table either under the given view path or as an anonymous table.
    final ContextResolvedTable contextResolvedTable;
    if (viewPath != null) {
        UnresolvedIdentifier unresolvedIdentifier = getParser().parseIdentifier(viewPath);
        final ObjectIdentifier objectIdentifier = catalogManager.qualifyIdentifier(unresolvedIdentifier);
        contextResolvedTable = ContextResolvedTable.temporary(objectIdentifier, resolvedCatalogTable);
    } else {
        contextResolvedTable = ContextResolvedTable.anonymous("datastream_source", resolvedCatalogTable);
    }
    final QueryOperation scanOperation = new ExternalQueryOperation<>(
            contextResolvedTable,
            dataStream,
            schemaTranslationResult.getPhysicalDataType(),
            schemaTranslationResult.isTopLevelRecord(),
            changelogMode);
    final List<String> projections = schemaTranslationResult.getProjections();
    if (projections == null) {
        return createTable(scanOperation);
    }
    // Project the scan to the columns requested by the schema translation result.
    final QueryOperation projectOperation = operationTreeBuilder.project(
            projections.stream().map(ApiExpressionUtils::unresolvedRef).collect(Collectors.toList()),
            scanOperation);
    return createTable(projectOperation);
}
Also used: ValidationException(org.apache.flink.table.api.ValidationException) ExternalCatalogTable(org.apache.flink.table.catalog.ExternalCatalogTable) UnresolvedIdentifier(org.apache.flink.table.catalog.UnresolvedIdentifier) ApiExpressionUtils(org.apache.flink.table.expressions.ApiExpressionUtils) CatalogManager(org.apache.flink.table.catalog.CatalogManager) SchemaTranslator(org.apache.flink.table.catalog.SchemaTranslator) ResolvedCatalogTable(org.apache.flink.table.catalog.ResolvedCatalogTable) OperationTreeBuilder(org.apache.flink.table.operations.utils.OperationTreeBuilder) ContextResolvedTable(org.apache.flink.table.catalog.ContextResolvedTable) ExternalQueryOperation(org.apache.flink.table.operations.ExternalQueryOperation) ObjectIdentifier(org.apache.flink.table.catalog.ObjectIdentifier) QueryOperation(org.apache.flink.table.operations.QueryOperation) DataStreamQueryOperation(org.apache.flink.table.operations.DataStreamQueryOperation)
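
The inverse direction, fromStreamInternal, is likewise reached through the public bridge API (StreamTableEnvironment.fromDataStream / fromChangelogStream). The sketch below is a hypothetical caller, not part of the sources above; the class name, sample elements, and the computed column are made up, and the comments describe behavior as suggested by the example rather than guaranteed.

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Schema;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class FromDataStreamSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);
        // A simple source stream; its TypeInformation drives SchemaTranslator.createConsumingResult().
        DataStream<String> names = env.fromElements("Alice", "Bob");
        // Without a schema, the columns are derived purely from the stream's type
        // (an atomic type such as String becomes a single column named f0).
        Table derived = tableEnv.fromDataStream(names);
        derived.printSchema();
        // With an explicit schema, the translation result may carry projections, which are then
        // applied via operationTreeBuilder.project(...) as shown in fromStreamInternal() above.
        Table withSchema = tableEnv.fromDataStream(
                names,
                Schema.newBuilder()
                        .columnByExpression("name_upper", "UPPER(f0)")
                        .build());
        withSchema.execute().print();
    }
}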

Aggregations

ValidationException (org.apache.flink.table.api.ValidationException) 2
CatalogManager (org.apache.flink.table.catalog.CatalogManager) 2
ContextResolvedTable (org.apache.flink.table.catalog.ContextResolvedTable) 2
ExternalCatalogTable (org.apache.flink.table.catalog.ExternalCatalogTable) 2
ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier) 2
ResolvedCatalogTable (org.apache.flink.table.catalog.ResolvedCatalogTable) 2
SchemaTranslator (org.apache.flink.table.catalog.SchemaTranslator) 2
UnresolvedIdentifier (org.apache.flink.table.catalog.UnresolvedIdentifier) 2
ApiExpressionUtils (org.apache.flink.table.expressions.ApiExpressionUtils) 2
DataStreamQueryOperation (org.apache.flink.table.operations.DataStreamQueryOperation) 2
ExternalQueryOperation (org.apache.flink.table.operations.ExternalQueryOperation) 2
QueryOperation (org.apache.flink.table.operations.QueryOperation) 2
OperationTreeBuilder (org.apache.flink.table.operations.utils.OperationTreeBuilder) 2
Collections (java.util.Collections) 1
List (java.util.List) 1
Optional (java.util.Optional) 1
Collectors (java.util.stream.Collectors) 1
Nullable (javax.annotation.Nullable) 1
Internal (org.apache.flink.annotation.Internal) 1
TypeInformation (org.apache.flink.api.common.typeinfo.TypeInformation) 1