
Example 41 with CatalogBaseTable

use of org.apache.flink.table.catalog.CatalogBaseTable in project flink by splunk.

In class TestValuesCatalog, method listPartitionsByFilter:

@Override
public List<CatalogPartitionSpec> listPartitionsByFilter(ObjectPath tablePath, List<Expression> filters) throws TableNotExistException, TableNotPartitionedException, CatalogException {
    if (!supportListPartitionByFilter) {
        throw new UnsupportedOperationException("TestValuesCatalog doesn't support list partition by filters");
    }
    List<CatalogPartitionSpec> partitions = listPartitions(tablePath);
    if (partitions.isEmpty()) {
        return partitions;
    }
    CatalogBaseTable table = this.getTable(tablePath);
    TableSchema schema = table.getSchema();
    List<ResolvedExpression> resolvedExpressions = filters.stream().map(filter -> {
        if (filter instanceof ResolvedExpression) {
            return (ResolvedExpression) filter;
        }
        throw new UnsupportedOperationException(String.format("TestValuesCatalog only works with resolved expressions. Get unresolved expression: %s", filter));
    }).collect(Collectors.toList());
    return partitions.stream().filter(partition -> {
        Function<String, Comparable<?>> getter = getValueGetter(partition.getPartitionSpec(), schema);
        return FilterUtils.isRetainedAfterApplyingFilterPredicates(resolvedExpressions, getter);
    }).collect(Collectors.toList());
}
Also used : DataType(org.apache.flink.table.types.DataType) TableNotExistException(org.apache.flink.table.catalog.exceptions.TableNotExistException) FilterUtils(org.apache.flink.table.planner.utils.FilterUtils) IntType(org.apache.flink.table.types.logical.IntType) TableException(org.apache.flink.table.api.TableException) TableSchema(org.apache.flink.table.api.TableSchema) VarCharType(org.apache.flink.table.types.logical.VarCharType) CatalogBaseTable(org.apache.flink.table.catalog.CatalogBaseTable) Expression(org.apache.flink.table.expressions.Expression) ObjectPath(org.apache.flink.table.catalog.ObjectPath) Function(java.util.function.Function) Collectors(java.util.stream.Collectors) CharType(org.apache.flink.table.types.logical.CharType) CatalogPartitionSpec(org.apache.flink.table.catalog.CatalogPartitionSpec) List(java.util.List) TableNotPartitionedException(org.apache.flink.table.catalog.exceptions.TableNotPartitionedException) DoubleType(org.apache.flink.table.types.logical.DoubleType) LogicalType(org.apache.flink.table.types.logical.LogicalType) BooleanType(org.apache.flink.table.types.logical.BooleanType) ResolvedExpression(org.apache.flink.table.expressions.ResolvedExpression) Map(java.util.Map) Optional(java.util.Optional) GenericInMemoryCatalog(org.apache.flink.table.catalog.GenericInMemoryCatalog) CatalogException(org.apache.flink.table.catalog.exceptions.CatalogException)
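
The partition filtering above hinges on the value getter: a function that maps a partition column name to a typed Comparable parsed from the raw partition spec. A minimal sketch of what such a helper could look like, assuming partition values are stored as strings; the name getValueGetter mirrors the call above, but this body is illustrative rather than Flink's actual implementation:

import java.util.Map;
import java.util.function.Function;
import org.apache.flink.table.api.TableSchema;
import org.apache.flink.table.types.DataType;
import org.apache.flink.table.types.logical.LogicalType;

// Illustrative sketch: resolve a partition column to a Comparable, using the
// table schema to decide which type to parse the raw string into.
private static Function<String, Comparable<?>> getValueGetter(
        Map<String, String> partitionSpec, TableSchema schema) {
    return fieldName -> {
        String raw = partitionSpec.get(fieldName);
        LogicalType type = schema.getFieldDataType(fieldName)
                .map(DataType::getLogicalType)
                .orElseThrow(() -> new IllegalArgumentException(
                        "Unknown partition column: " + fieldName));
        // Partition values arrive as strings; coerce them so the filter
        // predicates can compare them against typed literals.
        switch (type.getTypeRoot()) {
            case INTEGER:
                return Integer.valueOf(raw);
            case DOUBLE:
                return Double.valueOf(raw);
            case BOOLEAN:
                return Boolean.valueOf(raw);
            default:
                // CHAR/VARCHAR and anything else compare as plain strings.
                return raw;
        }
    };
}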

Example 42 with CatalogBaseTable

use of org.apache.flink.table.catalog.CatalogBaseTable in project flink by splunk.

In class TableEnvironmentImpl, method registerTableSourceInternal:

@Override
public void registerTableSourceInternal(String name, TableSource<?> tableSource) {
    validateTableSource(tableSource);
    ObjectIdentifier objectIdentifier = catalogManager.qualifyIdentifier(UnresolvedIdentifier.of(name));
    Optional<CatalogBaseTable> table = getTemporaryTable(objectIdentifier);
    if (table.isPresent()) {
        if (table.get() instanceof ConnectorCatalogTable<?, ?>) {
            ConnectorCatalogTable<?, ?> sourceSinkTable = (ConnectorCatalogTable<?, ?>) table.get();
            if (sourceSinkTable.getTableSource().isPresent()) {
                throw new ValidationException(String.format("Table '%s' already exists. Please choose a different name.", name));
            } else {
                // wrapper contains only sink (not source)
                ConnectorCatalogTable sourceAndSink = ConnectorCatalogTable.sourceAndSink(tableSource, sourceSinkTable.getTableSink().get(), !IS_STREAM_TABLE);
                catalogManager.dropTemporaryTable(objectIdentifier, false);
                catalogManager.createTemporaryTable(sourceAndSink, objectIdentifier, false);
            }
        } else {
            throw new ValidationException(String.format("Table '%s' already exists. Please choose a different name.", name));
        }
    } else {
        ConnectorCatalogTable source = ConnectorCatalogTable.source(tableSource, !IS_STREAM_TABLE);
        catalogManager.createTemporaryTable(source, objectIdentifier, false);
    }
}
Also used : CatalogBaseTable(org.apache.flink.table.catalog.CatalogBaseTable) ValidationException(org.apache.flink.table.api.ValidationException) ConnectorCatalogTable(org.apache.flink.table.catalog.ConnectorCatalogTable) ObjectIdentifier(org.apache.flink.table.catalog.ObjectIdentifier)
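
Note the asymmetry this method encodes: colliding with an existing source is an error, but a wrapper that holds only a sink is upgraded in place to carry both. A hedged usage sketch through the internal bridge interface; tEnv, mySource, and mySink are hypothetical, and TableEnvironmentInternal is assumed to expose both registration methods:

// Hypothetical setup: tEnv is an existing TableEnvironment, mySource and
// mySink are TableSource/TableSink instances with matching schemas.
TableEnvironmentInternal internalEnv = (TableEnvironmentInternal) tEnv;

// Registering the sink first stores a ConnectorCatalogTable wrapping only the sink.
internalEnv.registerTableSinkInternal("my_table", mySink);

// Same name, but the wrapper has no source yet: rather than failing, the
// method above swaps in a sourceAndSink wrapper.
internalEnv.registerTableSourceInternal("my_table", mySource);

// A second source registration under "my_table" would now throw
// ValidationException: "Table 'my_table' already exists. ..."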

Example 43 with CatalogBaseTable

use of org.apache.flink.table.catalog.CatalogBaseTable in project flink by splunk.

In class TableEnvironmentImpl, method createTemporaryView:

private void createTemporaryView(UnresolvedIdentifier identifier, Table view) {
    if (((TableImpl) view).getTableEnvironment() != this) {
        throw new TableException("Only table API objects that belong to this TableEnvironment can be registered.");
    }
    ObjectIdentifier tableIdentifier = catalogManager.qualifyIdentifier(identifier);
    QueryOperation queryOperation = qualifyQueryOperation(tableIdentifier, view.getQueryOperation());
    CatalogBaseTable tableTable = new QueryOperationCatalogView(queryOperation);
    catalogManager.createTemporaryTable(tableTable, tableIdentifier, false);
}
Also used : TableException(org.apache.flink.table.api.TableException) CatalogBaseTable(org.apache.flink.table.catalog.CatalogBaseTable) QueryOperationCatalogView(org.apache.flink.table.catalog.QueryOperationCatalogView) ObjectIdentifier(org.apache.flink.table.catalog.ObjectIdentifier) QueryOperation(org.apache.flink.table.operations.QueryOperation) TableSourceQueryOperation(org.apache.flink.table.operations.TableSourceQueryOperation) SourceQueryOperation(org.apache.flink.table.operations.SourceQueryOperation)
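
This private method backs the public entry point TableEnvironment.createTemporaryView(String, Table), wrapping the view's QueryOperation in a QueryOperationCatalogView as shown above. A minimal sketch of that public path:

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;

// Create an environment, derive a Table from it, and register the Table as a
// temporary view. Passing a Table owned by a different TableEnvironment would
// hit the TableException thrown above.
TableEnvironment tEnv = TableEnvironment.create(EnvironmentSettings.inStreamingMode());
Table numbers = tEnv.fromValues(1, 2, 3);
tEnv.createTemporaryView("my_view", numbers);

// The view now resolves like any other catalog object.
tEnv.sqlQuery("SELECT * FROM my_view").execute().print();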

Example 44 with CatalogBaseTable

use of org.apache.flink.table.catalog.CatalogBaseTable in project flink by splunk.

In class TableEnvironmentTest, method innerTestManagedTableFromDescriptor:

private void innerTestManagedTableFromDescriptor(boolean ignoreIfExists, boolean isTemporary) {
    final TableEnvironmentMock tEnv = TableEnvironmentMock.getStreamingInstance();
    final String catalog = tEnv.getCurrentCatalog();
    final String database = tEnv.getCurrentDatabase();
    final Schema schema = Schema.newBuilder().column("f0", DataTypes.INT()).build();
    final String tableName = UUID.randomUUID().toString();
    ObjectIdentifier identifier = ObjectIdentifier.of(catalog, database, tableName);
    // create table
    MANAGED_TABLES.put(identifier, new AtomicReference<>());
    CreateTableOperation createOperation =
            new CreateTableOperation(
                    identifier,
                    TableDescriptor.forManaged()
                            .schema(schema)
                            .option("a", "Test")
                            .build()
                            .toCatalogTable(),
                    ignoreIfExists,
                    isTemporary);
    tEnv.executeInternal(createOperation);
    // test ignore: create again
    if (ignoreIfExists) {
        tEnv.executeInternal(createOperation);
    } else {
        assertThatThrownBy(() -> tEnv.executeInternal(createOperation), isTemporary ? "already exists" : "Could not execute CreateTable");
    }
    // lookup table
    boolean isInCatalog = tEnv.getCatalog(catalog).orElseThrow(AssertionError::new).tableExists(new ObjectPath(database, tableName));
    if (isTemporary) {
        assertThat(isInCatalog).isFalse();
    } else {
        assertThat(isInCatalog).isTrue();
    }
    final Optional<ContextResolvedTable> lookupResult = tEnv.getCatalogManager().getTable(identifier);
    assertThat(lookupResult.isPresent()).isTrue();
    final CatalogBaseTable catalogTable = lookupResult.get().getTable();
    assertThat(catalogTable instanceof CatalogTable).isTrue();
    assertThat(catalogTable.getUnresolvedSchema()).isEqualTo(schema);
    assertThat(catalogTable.getOptions().get("a")).isEqualTo("Test");
    assertThat(catalogTable.getOptions().get(ENRICHED_KEY)).isEqualTo(ENRICHED_VALUE);
    AtomicReference<Map<String, String>> reference = MANAGED_TABLES.get(identifier);
    assertThat(reference.get()).isNotNull();
    assertThat(reference.get().get("a")).isEqualTo("Test");
    assertThat(reference.get().get(ENRICHED_KEY)).isEqualTo(ENRICHED_VALUE);
    DropTableOperation dropOperation = new DropTableOperation(identifier, ignoreIfExists, isTemporary);
    tEnv.executeInternal(dropOperation);
    assertThat(MANAGED_TABLES.get(identifier).get()).isNull();
    // test ignore: drop again
    if (ignoreIfExists) {
        tEnv.executeInternal(dropOperation);
    } else {
        assertThatThrownBy(() -> tEnv.executeInternal(dropOperation), "does not exist");
    }
    MANAGED_TABLES.remove(identifier);
}
Also used : ObjectPath(org.apache.flink.table.catalog.ObjectPath) CatalogBaseTable(org.apache.flink.table.catalog.CatalogBaseTable) CreateTableOperation(org.apache.flink.table.operations.ddl.CreateTableOperation) CatalogTable(org.apache.flink.table.catalog.CatalogTable) DropTableOperation(org.apache.flink.table.operations.ddl.DropTableOperation) ContextResolvedTable(org.apache.flink.table.catalog.ContextResolvedTable) TableEnvironmentMock(org.apache.flink.table.utils.TableEnvironmentMock) Map(java.util.Map) ObjectIdentifier(org.apache.flink.table.catalog.ObjectIdentifier)
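
The test drives TableDescriptor.forManaged() through internal operations; the same descriptor mechanics are available publicly via forConnector(). A hedged sketch of that public path, using the built-in datagen connector so the snippet is self-contained (the table name is arbitrary):

import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Schema;
import org.apache.flink.table.api.TableDescriptor;
import org.apache.flink.table.api.TableEnvironment;

// Register a temporary table from a descriptor, mirroring the schema and
// option the test above attaches to its managed descriptor.
TableEnvironment tEnv = TableEnvironment.create(EnvironmentSettings.inStreamingMode());
tEnv.createTemporaryTable(
        "descriptor_table",
        TableDescriptor.forConnector("datagen")
                .schema(Schema.newBuilder().column("f0", DataTypes.INT()).build())
                .option("number-of-rows", "3")
                .build());

tEnv.from("descriptor_table").execute().print();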

Example 45 with CatalogBaseTable

use of org.apache.flink.table.catalog.CatalogBaseTable in project flink by splunk.

In class SqlToOperationConverter, method convertAlterView:

/**
 * Converts an ALTER VIEW statement.
 */
private Operation convertAlterView(SqlAlterView alterView) {
    UnresolvedIdentifier unresolvedIdentifier = UnresolvedIdentifier.of(alterView.fullViewName());
    ObjectIdentifier viewIdentifier = catalogManager.qualifyIdentifier(unresolvedIdentifier);
    Optional<ContextResolvedTable> optionalCatalogTable = catalogManager.getTable(viewIdentifier);
    if (!optionalCatalogTable.isPresent() || optionalCatalogTable.get().isTemporary()) {
        throw new ValidationException(String.format("View %s doesn't exist or is a temporary view.", viewIdentifier.toString()));
    }
    CatalogBaseTable baseTable = optionalCatalogTable.get().getTable();
    if (baseTable instanceof CatalogTable) {
        throw new ValidationException("ALTER VIEW for a table is not allowed");
    }
    if (alterView instanceof SqlAlterViewRename) {
        UnresolvedIdentifier newUnresolvedIdentifier = UnresolvedIdentifier.of(((SqlAlterViewRename) alterView).fullNewViewName());
        ObjectIdentifier newTableIdentifier = catalogManager.qualifyIdentifier(newUnresolvedIdentifier);
        return new AlterViewRenameOperation(viewIdentifier, newTableIdentifier);
    } else if (alterView instanceof SqlAlterViewProperties) {
        SqlAlterViewProperties alterViewProperties = (SqlAlterViewProperties) alterView;
        CatalogView oldView = (CatalogView) baseTable;
        Map<String, String> newProperties = new HashMap<>(oldView.getOptions());
        newProperties.putAll(OperationConverterUtils.extractProperties(alterViewProperties.getPropertyList()));
        CatalogView newView = new CatalogViewImpl(oldView.getOriginalQuery(), oldView.getExpandedQuery(), oldView.getSchema(), newProperties, oldView.getComment());
        return new AlterViewPropertiesOperation(viewIdentifier, newView);
    } else if (alterView instanceof SqlAlterViewAs) {
        SqlAlterViewAs alterViewAs = (SqlAlterViewAs) alterView;
        final SqlNode newQuery = alterViewAs.getNewQuery();
        CatalogView oldView = (CatalogView) baseTable;
        CatalogView newView = convertViewQuery(newQuery, Collections.emptyList(), oldView.getOptions(), oldView.getComment());
        return new AlterViewAsOperation(viewIdentifier, newView);
    } else {
        throw new ValidationException(String.format("[%s] needs to implement", alterView.toSqlString(CalciteSqlDialect.DEFAULT)));
    }
}
Also used : AlterViewPropertiesOperation(org.apache.flink.table.operations.ddl.AlterViewPropertiesOperation) AlterViewAsOperation(org.apache.flink.table.operations.ddl.AlterViewAsOperation) CatalogBaseTable(org.apache.flink.table.catalog.CatalogBaseTable) ValidationException(org.apache.flink.table.api.ValidationException) CatalogViewImpl(org.apache.flink.table.catalog.CatalogViewImpl) SqlAlterViewRename(org.apache.flink.sql.parser.ddl.SqlAlterViewRename) UnresolvedIdentifier(org.apache.flink.table.catalog.UnresolvedIdentifier) CatalogTable(org.apache.flink.table.catalog.CatalogTable) ResolvedCatalogTable(org.apache.flink.table.catalog.ResolvedCatalogTable) AlterViewRenameOperation(org.apache.flink.table.operations.ddl.AlterViewRenameOperation) SqlAlterViewAs(org.apache.flink.sql.parser.ddl.SqlAlterViewAs) SqlAlterViewProperties(org.apache.flink.sql.parser.ddl.SqlAlterViewProperties) ContextResolvedTable(org.apache.flink.table.catalog.ContextResolvedTable) CatalogView(org.apache.flink.table.catalog.CatalogView) Map(java.util.Map) LinkedHashMap(java.util.LinkedHashMap) HashMap(java.util.HashMap) ObjectIdentifier(org.apache.flink.table.catalog.ObjectIdentifier) SqlNode(org.apache.calcite.sql.SqlNode)
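
Each branch above corresponds to one ALTER VIEW variant in Flink SQL. A short sketch of statements that reach this converter; tEnv is a hypothetical TableEnvironment, and the view must already exist as a permanent (non-temporary) view or the ValidationException above fires first:

// A permanent view to operate on.
tEnv.executeSql("CREATE VIEW v AS SELECT 1 AS x");

// SqlAlterViewRename -> AlterViewRenameOperation
tEnv.executeSql("ALTER VIEW v RENAME TO v2");

// SqlAlterViewProperties -> AlterViewPropertiesOperation
tEnv.executeSql("ALTER VIEW v2 SET ('k' = 'v')");

// SqlAlterViewAs -> AlterViewAsOperation
tEnv.executeSql("ALTER VIEW v2 AS SELECT 2 AS x");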

Aggregations

CatalogBaseTable (org.apache.flink.table.catalog.CatalogBaseTable): 106 usages
ObjectPath (org.apache.flink.table.catalog.ObjectPath): 52 usages
CatalogTable (org.apache.flink.table.catalog.CatalogTable): 46 usages
Test (org.junit.Test): 42 usages
ValidationException (org.apache.flink.table.api.ValidationException): 33 usages
ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier): 30 usages
CatalogView (org.apache.flink.table.catalog.CatalogView): 27 usages
TableSchema (org.apache.flink.table.api.TableSchema): 24 usages
Table (org.apache.hadoop.hive.metastore.api.Table): 21 usages
HashMap (java.util.HashMap): 19 usages
SqlCreateHiveTable (org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable): 18 usages
UniqueConstraint (org.apache.flink.table.api.constraints.UniqueConstraint): 15 usages
ContextResolvedTable (org.apache.flink.table.catalog.ContextResolvedTable): 15 usages
Map (java.util.Map): 13 usages
LinkedHashMap (java.util.LinkedHashMap): 12 usages
CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl): 12 usages
AlterViewAsOperation (org.apache.flink.table.operations.ddl.AlterViewAsOperation): 12 usages
DropTableOperation (org.apache.flink.table.operations.ddl.DropTableOperation): 12 usages
ArrayList (java.util.ArrayList): 9 usages
CatalogException (org.apache.flink.table.catalog.exceptions.CatalogException): 9 usages