Example 11 with ResolvedCatalogTable

Use of org.apache.flink.table.catalog.ResolvedCatalogTable in project flink by apache: class ContextResolvedTableJsonDeserializer, method deserialize.

@Override
public ContextResolvedTable deserialize(JsonParser jsonParser, DeserializationContext ctx) throws IOException {
    final CatalogPlanRestore planRestoreOption =
            SerdeContext.get(ctx).getConfiguration().get(PLAN_RESTORE_CATALOG_OBJECTS);
    final CatalogManager catalogManager =
            SerdeContext.get(ctx).getFlinkContext().getCatalogManager();
    final ObjectNode objectNode = jsonParser.readValueAsTree();
    // Deserialize the two fields, if available
    final ObjectIdentifier identifier =
            JsonSerdeUtil.deserializeOptionalField(
                            objectNode, FIELD_NAME_IDENTIFIER, ObjectIdentifier.class, jsonParser.getCodec(), ctx)
                    .orElse(null);
    ResolvedCatalogTable resolvedCatalogTable =
            JsonSerdeUtil.deserializeOptionalField(
                            objectNode, FIELD_NAME_CATALOG_TABLE, ResolvedCatalogTable.class, jsonParser.getCodec(), ctx)
                    .orElse(null);
    if (identifier == null && resolvedCatalogTable == null) {
        throw new ValidationException(
                String.format(
                        "The input JSON is invalid because it doesn't contain '%s', nor the '%s'.",
                        FIELD_NAME_IDENTIFIER, FIELD_NAME_CATALOG_TABLE));
    }
    if (identifier == null) {
        if (isLookupForced(planRestoreOption)) {
            throw missingIdentifier();
        }
        return ContextResolvedTable.anonymous(resolvedCatalogTable);
    }
    Optional<ContextResolvedTable> contextResolvedTableFromCatalog =
            isLookupEnabled(planRestoreOption)
                    ? catalogManager.getTable(identifier)
                    : Optional.empty();
    // If we have a schema from the plan and from the catalog, we need to check they match.
    if (contextResolvedTableFromCatalog.isPresent() && resolvedCatalogTable != null) {
        ResolvedSchema schemaFromPlan = resolvedCatalogTable.getResolvedSchema();
        ResolvedSchema schemaFromCatalog = contextResolvedTableFromCatalog.get().getResolvedSchema();
        if (!areResolvedSchemasEqual(schemaFromPlan, schemaFromCatalog)) {
            throw schemaNotMatching(identifier, schemaFromPlan, schemaFromCatalog);
        }
    }
    if (resolvedCatalogTable == null || isLookupForced(planRestoreOption)) {
        if (!isLookupEnabled(planRestoreOption)) {
            throw lookupDisabled(identifier);
        }
        // We use what is stored inside the catalog
        return contextResolvedTableFromCatalog.orElseThrow(
                () -> missingTableFromCatalog(identifier, isLookupForced(planRestoreOption)));
    }
    if (contextResolvedTableFromCatalog.isPresent()) {
        // If the options are not serialized in the plan (it was compiled with
        // CatalogPlanCompilation == SCHEMA), we just need to return the catalog query result
        if (objectNode.at("/" + FIELD_NAME_CATALOG_TABLE + "/" + OPTIONS).isMissingNode()) {
            return contextResolvedTableFromCatalog.get();
        }
        return contextResolvedTableFromCatalog
                .flatMap(ContextResolvedTable::getCatalog)
                .map(c -> ContextResolvedTable.permanent(identifier, c, resolvedCatalogTable))
                .orElseGet(() -> ContextResolvedTable.temporary(identifier, resolvedCatalogTable));
    }
    return ContextResolvedTable.temporary(identifier, resolvedCatalogTable);
}
Also used: CatalogManager (org.apache.flink.table.catalog.CatalogManager), ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier), FIELD_NAME_IDENTIFIER (org.apache.flink.table.planner.plan.nodes.exec.serde.ContextResolvedTableJsonSerializer.FIELD_NAME_IDENTIFIER), Column (org.apache.flink.table.catalog.Column), ObjectNode (org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode), IDENTIFIER (org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanRestore.IDENTIFIER), ResolvedSchema (org.apache.flink.table.catalog.ResolvedSchema), JsonParser (org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParser), IOException (java.io.IOException), PLAN_COMPILE_CATALOG_OBJECTS (org.apache.flink.table.api.config.TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS), CatalogPlanRestore (org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanRestore), CatalogPlanCompilation (org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanCompilation), Objects (java.util.Objects), OPTIONS (org.apache.flink.table.planner.plan.nodes.exec.serde.ResolvedCatalogTableJsonSerializer.OPTIONS), DeserializationContext (org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext), List (java.util.List), ValidationException (org.apache.flink.table.api.ValidationException), Optional (java.util.Optional), Internal (org.apache.flink.annotation.Internal), StdDeserializer (org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.StdDeserializer), PLAN_RESTORE_CATALOG_OBJECTS (org.apache.flink.table.api.config.TableConfigOptions.PLAN_RESTORE_CATALOG_OBJECTS), ResolvedCatalogTable (org.apache.flink.table.catalog.ResolvedCatalogTable), FIELD_NAME_CATALOG_TABLE (org.apache.flink.table.planner.plan.nodes.exec.serde.ContextResolvedTableJsonSerializer.FIELD_NAME_CATALOG_TABLE), ContextResolvedTable (org.apache.flink.table.catalog.ContextResolvedTable)
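
Which branch the deserializer takes is driven by the table.plan.restore.catalog-objects option read at the top of the method. Below is a minimal sketch of selecting the restore behavior before loading a compiled plan; the option and enum constants are the real TableConfigOptions members, while the environment setup is illustrative:

// Assumed imports: org.apache.flink.table.api.EnvironmentSettings,
// org.apache.flink.table.api.TableEnvironment,
// org.apache.flink.table.api.config.TableConfigOptions
TableEnvironment env = TableEnvironment.create(EnvironmentSettings.inStreamingMode());

// ALL (default): restore from the plan, looking up the catalog if information is missing.
// ALL_ENFORCED: restore only from the plan; catalog lookup is disabled.
// IDENTIFIER: ignore the plan's table metadata and force a catalog lookup.
env.getConfig().set(
        TableConfigOptions.PLAN_RESTORE_CATALOG_OBJECTS,
        TableConfigOptions.CatalogPlanRestore.IDENTIFIER);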

Example 12 with ResolvedCatalogTable

Use of org.apache.flink.table.catalog.ResolvedCatalogTable in project flink by apache: class QueryOperationTest, method testSummaryString.

@Test
public void testSummaryString() {
    ResolvedSchema schema =
            ResolvedSchema.physical(
                    Collections.singletonList("a"), Collections.singletonList(DataTypes.INT()));
    ProjectQueryOperation tableOperation =
            new ProjectQueryOperation(
                    Collections.singletonList(
                            new FieldReferenceExpression("a", DataTypes.INT(), 0, 0)),
                    new SourceQueryOperation(
                            ContextResolvedTable.temporary(
                                    ObjectIdentifier.of("cat1", "db1", "tab1"),
                                    new ResolvedCatalogTable(
                                            CatalogTable.of(
                                                    Schema.newBuilder().build(),
                                                    null,
                                                    Collections.emptyList(),
                                                    Collections.emptyMap()),
                                            schema))),
                    schema);
    SetQueryOperation unionQueryOperation =
            new SetQueryOperation(
                    tableOperation,
                    tableOperation,
                    SetQueryOperation.SetQueryOperationType.UNION,
                    true,
                    schema);
    assertEquals(
            "Union: (all: [true])\n"
                    + "    Project: (projections: [a])\n"
                    + "        CatalogTable: (identifier: [cat1.db1.tab1], fields: [a])\n"
                    + "    Project: (projections: [a])\n"
                    + "        CatalogTable: (identifier: [cat1.db1.tab1], fields: [a])",
            unionQueryOperation.asSummaryString());
}
Also used: ResolvedCatalogTable (org.apache.flink.table.catalog.ResolvedCatalogTable), FieldReferenceExpression (org.apache.flink.table.expressions.FieldReferenceExpression), ResolvedSchema (org.apache.flink.table.catalog.ResolvedSchema), Test (org.junit.Test)
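
The indented string asserted by this test is produced by QueryOperation#asSummaryString(), which is reachable from any Table. A hypothetical snippet (the tableEnv variable, table path, and column name are illustrative):

// Assumed imports: org.apache.flink.table.api.Table and
// static org.apache.flink.table.api.Expressions.$
Table t = tableEnv.from("cat1.db1.tab1").select($("a"));
// Prints the same style of indented operation-tree summary as asserted above.
System.out.println(t.getQueryOperation().asSummaryString());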

Example 13 with ResolvedCatalogTable

Use of org.apache.flink.table.catalog.ResolvedCatalogTable in project flink by apache: class AbstractStreamTableEnvironmentImpl, method toStreamInternal.

protected <T> DataStream<T> toStreamInternal(
        Table table,
        SchemaTranslator.ProducingResult schemaTranslationResult,
        @Nullable ChangelogMode changelogMode) {
    final CatalogManager catalogManager = getCatalogManager();
    final OperationTreeBuilder operationTreeBuilder = getOperationTreeBuilder();
    final QueryOperation projectOperation =
            schemaTranslationResult
                    .getProjections()
                    .map(projections ->
                            operationTreeBuilder.project(
                                    projections.stream()
                                            .map(ApiExpressionUtils::unresolvedRef)
                                            .collect(Collectors.toList()),
                                    table.getQueryOperation()))
                    .orElseGet(table::getQueryOperation);
    final ResolvedCatalogTable resolvedCatalogTable =
            catalogManager.resolveCatalogTable(
                    new ExternalCatalogTable(schemaTranslationResult.getSchema()));
    final ExternalModifyOperation modifyOperation =
            new ExternalModifyOperation(
                    ContextResolvedTable.anonymous("datastream_sink", resolvedCatalogTable),
                    projectOperation,
                    changelogMode,
                    schemaTranslationResult
                            .getPhysicalDataType()
                            .orElseGet(() ->
                                    resolvedCatalogTable.getResolvedSchema().toPhysicalRowDataType()));
    return toStreamInternal(table, modifyOperation);
}
Also used: DataType (org.apache.flink.table.types.DataType), CatalogManager (org.apache.flink.table.catalog.CatalogManager), ModifyOperation (org.apache.flink.table.operations.ModifyOperation), QueryOperation (org.apache.flink.table.operations.QueryOperation), Schema (org.apache.flink.table.api.Schema), ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier), DataStreamQueryOperation (org.apache.flink.table.operations.DataStreamQueryOperation), Tuple2 (org.apache.flink.api.java.tuple.Tuple2), UnresolvedIdentifier (org.apache.flink.table.catalog.UnresolvedIdentifier), ChangelogMode (org.apache.flink.table.connector.ChangelogMode), ExecutorFactory (org.apache.flink.table.delegation.ExecutorFactory), TupleTypeInfo (org.apache.flink.api.java.typeutils.TupleTypeInfo), Types (org.apache.flink.table.api.Types), FunctionCatalog (org.apache.flink.table.catalog.FunctionCatalog), Planner (org.apache.flink.table.delegation.Planner), ExternalQueryOperation (org.apache.flink.table.operations.ExternalQueryOperation), Expression (org.apache.flink.table.expressions.Expression), TableEnvironmentImpl (org.apache.flink.table.api.internal.TableEnvironmentImpl), OperationTreeBuilder (org.apache.flink.table.operations.utils.OperationTreeBuilder), TypeInformation (org.apache.flink.api.common.typeinfo.TypeInformation), ExternalModifyOperation (org.apache.flink.table.operations.ExternalModifyOperation), ResolvedCatalogTable (org.apache.flink.table.catalog.ResolvedCatalogTable), Nullable (javax.annotation.Nullable), ContextResolvedTable (org.apache.flink.table.catalog.ContextResolvedTable), TimeCharacteristic (org.apache.flink.streaming.api.TimeCharacteristic), SchemaTranslator (org.apache.flink.table.catalog.SchemaTranslator), ModuleManager (org.apache.flink.table.module.ModuleManager), TableConfig (org.apache.flink.table.api.TableConfig), TableException (org.apache.flink.table.api.TableException), Table (org.apache.flink.table.api.Table), StreamExecutorFactory (org.apache.flink.table.delegation.StreamExecutorFactory), Preconditions (org.apache.flink.util.Preconditions), Collectors (java.util.stream.Collectors), FieldInfoUtils (org.apache.flink.table.typeutils.FieldInfoUtils), DataStream (org.apache.flink.streaming.api.datastream.DataStream), ExternalCatalogTable (org.apache.flink.table.catalog.ExternalCatalogTable), List (java.util.List), TypeExtractor (org.apache.flink.api.java.typeutils.TypeExtractor), FactoryUtil (org.apache.flink.table.factories.FactoryUtil), ValidationException (org.apache.flink.table.api.ValidationException), Executor (org.apache.flink.table.delegation.Executor), ApiExpressionUtils (org.apache.flink.table.expressions.ApiExpressionUtils), Optional (java.util.Optional), Internal (org.apache.flink.annotation.Internal), TypeConversions (org.apache.flink.table.types.utils.TypeConversions), Transformation (org.apache.flink.api.dag.Transformation), Collections (java.util.Collections), StreamExecutionEnvironment (org.apache.flink.streaming.api.environment.StreamExecutionEnvironment)
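
toStreamInternal is internal plumbing; user code reaches it through the public StreamTableEnvironment conversion methods. A hedged sketch of those entry points (the query itself is illustrative):

// Assumed imports: org.apache.flink.streaming.api.environment.StreamExecutionEnvironment,
// org.apache.flink.table.api.bridge.java.StreamTableEnvironment,
// org.apache.flink.table.api.Table, org.apache.flink.streaming.api.datastream.DataStream,
// org.apache.flink.types.Row
StreamExecutionEnvironment senv = StreamExecutionEnvironment.getExecutionEnvironment();
StreamTableEnvironment tEnv = StreamTableEnvironment.create(senv);

Table table = tEnv.sqlQuery("SELECT 1 AS a");
// Both conversions are backed by an ExternalModifyOperation sink as built above.
DataStream<Row> insertOnly = tEnv.toDataStream(table);
DataStream<Row> changelog = tEnv.toChangelogStream(table);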

Example 14 with ResolvedCatalogTable

Use of org.apache.flink.table.catalog.ResolvedCatalogTable in project flink by apache: class AbstractStreamTableEnvironmentImpl, method fromStreamInternal.

protected <T> Table fromStreamInternal(
        DataStream<T> dataStream,
        @Nullable Schema schema,
        @Nullable String viewPath,
        ChangelogMode changelogMode) {
    Preconditions.checkNotNull(dataStream, "Data stream must not be null.");
    Preconditions.checkNotNull(changelogMode, "Changelog mode must not be null.");
    if (dataStream.getExecutionEnvironment() != executionEnvironment) {
        throw new ValidationException("The DataStream's StreamExecutionEnvironment must be identical to the one that " + "has been passed to the StreamTableEnvironment during instantiation.");
    }
    final CatalogManager catalogManager = getCatalogManager();
    final OperationTreeBuilder operationTreeBuilder = getOperationTreeBuilder();
    final SchemaTranslator.ConsumingResult schemaTranslationResult =
            SchemaTranslator.createConsumingResult(
                    catalogManager.getDataTypeFactory(), dataStream.getType(), schema);
    final ResolvedCatalogTable resolvedCatalogTable =
            catalogManager.resolveCatalogTable(
                    new ExternalCatalogTable(schemaTranslationResult.getSchema()));
    final ContextResolvedTable contextResolvedTable;
    if (viewPath != null) {
        UnresolvedIdentifier unresolvedIdentifier = getParser().parseIdentifier(viewPath);
        final ObjectIdentifier objectIdentifier = catalogManager.qualifyIdentifier(unresolvedIdentifier);
        contextResolvedTable = ContextResolvedTable.temporary(objectIdentifier, resolvedCatalogTable);
    } else {
        contextResolvedTable = ContextResolvedTable.anonymous("datastream_source", resolvedCatalogTable);
    }
    final QueryOperation scanOperation =
            new ExternalQueryOperation<>(
                    contextResolvedTable,
                    dataStream,
                    schemaTranslationResult.getPhysicalDataType(),
                    schemaTranslationResult.isTopLevelRecord(),
                    changelogMode);
    final List<String> projections = schemaTranslationResult.getProjections();
    if (projections == null) {
        return createTable(scanOperation);
    }
    final QueryOperation projectOperation =
            operationTreeBuilder.project(
                    projections.stream()
                            .map(ApiExpressionUtils::unresolvedRef)
                            .collect(Collectors.toList()),
                    scanOperation);
    return createTable(projectOperation);
}
Also used: ValidationException (org.apache.flink.table.api.ValidationException), ExternalCatalogTable (org.apache.flink.table.catalog.ExternalCatalogTable), UnresolvedIdentifier (org.apache.flink.table.catalog.UnresolvedIdentifier), ApiExpressionUtils (org.apache.flink.table.expressions.ApiExpressionUtils), CatalogManager (org.apache.flink.table.catalog.CatalogManager), SchemaTranslator (org.apache.flink.table.catalog.SchemaTranslator), ResolvedCatalogTable (org.apache.flink.table.catalog.ResolvedCatalogTable), OperationTreeBuilder (org.apache.flink.table.operations.utils.OperationTreeBuilder), ContextResolvedTable (org.apache.flink.table.catalog.ContextResolvedTable), ExternalQueryOperation (org.apache.flink.table.operations.ExternalQueryOperation), ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier), QueryOperation (org.apache.flink.table.operations.QueryOperation), DataStreamQueryOperation (org.apache.flink.table.operations.DataStreamQueryOperation)
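
Conversely, fromStreamInternal backs the public fromDataStream, fromChangelogStream, and createTemporaryView(String, DataStream) methods. A minimal sketch reusing the senv and tEnv setup from the previous snippet (the values are illustrative):

// Assumed imports: org.apache.flink.streaming.api.datastream.DataStream,
// org.apache.flink.table.api.Table
DataStream<Integer> numbers = senv.fromElements(1, 2, 3);

// Anonymous table: takes the "datastream_source" branch above.
Table anonymous = tEnv.fromDataStream(numbers);

// Registered temporary view: takes the viewPath != null branch above.
tEnv.createTemporaryView("Numbers", numbers);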

Example 15 with ResolvedCatalogTable

Use of org.apache.flink.table.catalog.ResolvedCatalogTable in project flink by apache: class TableEnvironmentImpl, method from.

@Override
public Table from(TableDescriptor descriptor) {
    Preconditions.checkNotNull(descriptor, "Table descriptor must not be null.");
    final ResolvedCatalogTable resolvedCatalogBaseTable =
            catalogManager.resolveCatalogTable(descriptor.toCatalogTable());
    final QueryOperation queryOperation =
            new SourceQueryOperation(ContextResolvedTable.anonymous(resolvedCatalogBaseTable));
    return createTable(queryOperation);
}
Also used: ResolvedCatalogTable (org.apache.flink.table.catalog.ResolvedCatalogTable), TableSourceQueryOperation (org.apache.flink.table.operations.TableSourceQueryOperation), SourceQueryOperation (org.apache.flink.table.operations.SourceQueryOperation), QueryOperation (org.apache.flink.table.operations.QueryOperation)
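
A hedged usage sketch of TableEnvironment#from(TableDescriptor); the datagen connector ships with Flink, while the schema and option values here are illustrative:

// Assumed imports: org.apache.flink.table.api.DataTypes, org.apache.flink.table.api.Schema,
// org.apache.flink.table.api.Table, org.apache.flink.table.api.TableDescriptor
Table table = tEnv.from(
        TableDescriptor.forConnector("datagen")
                .schema(Schema.newBuilder()
                        .column("f0", DataTypes.INT())
                        .build())
                .option("rows-per-second", "5")
                .build());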

Aggregations

ResolvedCatalogTable (org.apache.flink.table.catalog.ResolvedCatalogTable): 23
ResolvedSchema (org.apache.flink.table.catalog.ResolvedSchema): 11
ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier): 8
HashMap (java.util.HashMap): 7
ValidationException (org.apache.flink.table.api.ValidationException): 5
CatalogTable (org.apache.flink.table.catalog.CatalogTable): 5
List (java.util.List): 4
CatalogManager (org.apache.flink.table.catalog.CatalogManager): 4
ContextResolvedTable (org.apache.flink.table.catalog.ContextResolvedTable): 4
ExternalCatalogTable (org.apache.flink.table.catalog.ExternalCatalogTable): 4
QueryOperation (org.apache.flink.table.operations.QueryOperation): 4
Test (org.junit.Test): 4
Optional (java.util.Optional): 3
RelDataType (org.apache.calcite.rel.type.RelDataType): 3
SchemaTranslator (org.apache.flink.table.catalog.SchemaTranslator): 3
JsonSerdeTestUtil.configuredSerdeContext (org.apache.flink.table.planner.plan.nodes.exec.serde.JsonSerdeTestUtil.configuredSerdeContext): 3
Test (org.junit.jupiter.api.Test): 3
ParameterizedTest (org.junit.jupiter.params.ParameterizedTest): 3
Map (java.util.Map): 2
Collectors (java.util.stream.Collectors): 2