Example 66 with CatalogBaseTable

use of org.apache.flink.table.catalog.CatalogBaseTable in project flink by apache.

the class TestValuesCatalog method listPartitionsByFilter.

@Override
public List<CatalogPartitionSpec> listPartitionsByFilter(ObjectPath tablePath, List<Expression> filters) throws TableNotExistException, TableNotPartitionedException, CatalogException {
    if (!supportListPartitionByFilter) {
        throw new UnsupportedOperationException("TestValuesCatalog doesn't support list partition by filters");
    }
    List<CatalogPartitionSpec> partitions = listPartitions(tablePath);
    if (partitions.isEmpty()) {
        return partitions;
    }
    CatalogBaseTable table = this.getTable(tablePath);
    TableSchema schema = table.getSchema();
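    // Partition filters arrive as generic Expressions; only already-resolved
    // expressions can be evaluated against partition values here.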
    List<ResolvedExpression> resolvedExpressions = filters.stream().map(filter -> {
        if (filter instanceof ResolvedExpression) {
            return (ResolvedExpression) filter;
        }
        throw new UnsupportedOperationException(String.format("TestValuesCatalog only works with resolved expressions. Get unresolved expression: %s", filter));
    }).collect(Collectors.toList());
    return partitions.stream().filter(partition -> {
        Function<String, Comparable<?>> getter = getValueGetter(partition.getPartitionSpec(), schema);
        return FilterUtils.isRetainedAfterApplyingFilterPredicates(resolvedExpressions, getter);
    }).collect(Collectors.toList());
}
Also used : DataType(org.apache.flink.table.types.DataType) TableNotExistException(org.apache.flink.table.catalog.exceptions.TableNotExistException) FilterUtils(org.apache.flink.table.planner.utils.FilterUtils) IntType(org.apache.flink.table.types.logical.IntType) TableException(org.apache.flink.table.api.TableException) TableSchema(org.apache.flink.table.api.TableSchema) VarCharType(org.apache.flink.table.types.logical.VarCharType) CatalogBaseTable(org.apache.flink.table.catalog.CatalogBaseTable) Expression(org.apache.flink.table.expressions.Expression) ObjectPath(org.apache.flink.table.catalog.ObjectPath) Function(java.util.function.Function) Collectors(java.util.stream.Collectors) CharType(org.apache.flink.table.types.logical.CharType) CatalogPartitionSpec(org.apache.flink.table.catalog.CatalogPartitionSpec) List(java.util.List) TableNotPartitionedException(org.apache.flink.table.catalog.exceptions.TableNotPartitionedException) DoubleType(org.apache.flink.table.types.logical.DoubleType) LogicalType(org.apache.flink.table.types.logical.LogicalType) BooleanType(org.apache.flink.table.types.logical.BooleanType) ResolvedExpression(org.apache.flink.table.expressions.ResolvedExpression) Map(java.util.Map) Optional(java.util.Optional) GenericInMemoryCatalog(org.apache.flink.table.catalog.GenericInMemoryCatalog) CatalogException(org.apache.flink.table.catalog.exceptions.CatalogException)
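
The heart of the filtering is the value getter: for each candidate partition it maps a partition column name to a Comparable parsed from that partition's spec, and FilterUtils evaluates the resolved predicates against it. A minimal self-contained sketch of the idea (the class name and the hard-coded INT conversion are illustrative assumptions, not the actual getValueGetter implementation, which consults the table schema):

import java.util.Map;
import java.util.function.Function;

final class PartitionValueGetter {

    // Hypothetical stand-in for getValueGetter: closes over one partition's
    // spec and converts the raw string value for the requested field.
    static Function<String, Comparable<?>> forSpec(Map<String, String> spec) {
        return fieldName -> {
            String raw = spec.get(fieldName);
            // Assumes an INT partition column purely for illustration; the
            // real code picks the conversion from the column's logical type.
            return raw == null ? null : Integer.valueOf(raw);
        };
    }
}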

Example 67 with CatalogBaseTable

use of org.apache.flink.table.catalog.CatalogBaseTable in project flink-mirror by flink-ci.

the class HiveParserDDLSemanticAnalyzer method convertAlterView.

private Operation convertAlterView(HiveParserASTNode ast) throws SemanticException {
    Operation operation = null;
    String[] qualified = HiveParserBaseSemanticAnalyzer.getQualifiedTableName((HiveParserASTNode) ast.getChild(0));
    String tableName = HiveParserBaseSemanticAnalyzer.getDotName(qualified);
    CatalogBaseTable alteredTable = getAlteredTable(tableName, true);
    if (ast.getChild(1).getType() == HiveASTParser.TOK_QUERY) {
        // ALTER VIEW ... AS <query> is handled like re-creating the view
        operation = convertCreateView(ast);
    } else {
        ast = (HiveParserASTNode) ast.getChild(1);
        switch(ast.getType()) {
            case HiveASTParser.TOK_ALTERVIEW_PROPERTIES:
                operation = convertAlterTableProps(alteredTable, tableName, null, ast, true, false);
                break;
            case HiveASTParser.TOK_ALTERVIEW_DROPPROPERTIES:
                operation = convertAlterTableProps(alteredTable, tableName, null, ast, true, true);
                break;
            case HiveASTParser.TOK_ALTERVIEW_RENAME:
                operation = convertAlterTableRename(tableName, ast, true);
                break;
            case HiveASTParser.TOK_ALTERVIEW_ADDPARTS:
            case HiveASTParser.TOK_ALTERVIEW_DROPPARTS:
                handleUnsupportedOperation("ADD/DROP PARTITION for view is not supported");
                break;
            default:
                throw new ValidationException("Unknown AST node for ALTER VIEW: " + ast);
        }
    }
    return operation;
}
Also used : CatalogBaseTable(org.apache.flink.table.catalog.CatalogBaseTable) ValidationException(org.apache.flink.table.api.ValidationException) DropDatabaseOperation(org.apache.flink.table.operations.ddl.DropDatabaseOperation) AlterTableOptionsOperation(org.apache.flink.table.operations.ddl.AlterTableOptionsOperation) UseDatabaseOperation(org.apache.flink.table.operations.UseDatabaseOperation) CreateViewOperation(org.apache.flink.table.operations.ddl.CreateViewOperation) AlterDatabaseOperation(org.apache.flink.table.operations.ddl.AlterDatabaseOperation) HiveOperation(org.apache.hadoop.hive.ql.plan.HiveOperation) QueryOperation(org.apache.flink.table.operations.QueryOperation) DropCatalogFunctionOperation(org.apache.flink.table.operations.ddl.DropCatalogFunctionOperation) ShowTablesOperation(org.apache.flink.table.operations.ShowTablesOperation) DescribeTableOperation(org.apache.flink.table.operations.DescribeTableOperation) ShowFunctionsOperation(org.apache.flink.table.operations.ShowFunctionsOperation) CreateDatabaseOperation(org.apache.flink.table.operations.ddl.CreateDatabaseOperation) AlterPartitionPropertiesOperation(org.apache.flink.table.operations.ddl.AlterPartitionPropertiesOperation) ShowPartitionsOperation(org.apache.flink.table.operations.ShowPartitionsOperation) AlterViewPropertiesOperation(org.apache.flink.table.operations.ddl.AlterViewPropertiesOperation) Operation(org.apache.flink.table.operations.Operation) DropTempSystemFunctionOperation(org.apache.flink.table.operations.ddl.DropTempSystemFunctionOperation) ShowViewsOperation(org.apache.flink.table.operations.ShowViewsOperation) ShowDatabasesOperation(org.apache.flink.table.operations.ShowDatabasesOperation) AlterTableSchemaOperation(org.apache.flink.table.operations.ddl.AlterTableSchemaOperation) CreateTableASOperation(org.apache.flink.table.operations.ddl.CreateTableASOperation) DropTableOperation(org.apache.flink.table.operations.ddl.DropTableOperation) AlterViewAsOperation(org.apache.flink.table.operations.ddl.AlterViewAsOperation) CreateTableOperation(org.apache.flink.table.operations.ddl.CreateTableOperation) DropViewOperation(org.apache.flink.table.operations.ddl.DropViewOperation) AddPartitionsOperation(org.apache.flink.table.operations.ddl.AddPartitionsOperation) DropPartitionsOperation(org.apache.flink.table.operations.ddl.DropPartitionsOperation) AlterTableRenameOperation(org.apache.flink.table.operations.ddl.AlterTableRenameOperation) AlterViewRenameOperation(org.apache.flink.table.operations.ddl.AlterViewRenameOperation) CreateCatalogFunctionOperation(org.apache.flink.table.operations.ddl.CreateCatalogFunctionOperation) CreateTempSystemFunctionOperation(org.apache.flink.table.operations.ddl.CreateTempSystemFunctionOperation)
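
For orientation, these are plausible Hive SQL shapes for each branch of the switch; the statement-to-token mapping is inferred from the token names rather than taken from the HiveASTParser grammar:

// Illustrative ALTER VIEW statements per branch (mapping assumed, not verified):
String[] alterViewExamples = {
    "ALTER VIEW v SET TBLPROPERTIES ('owner'='bob')", // TOK_ALTERVIEW_PROPERTIES
    "ALTER VIEW v UNSET TBLPROPERTIES ('owner')", // TOK_ALTERVIEW_DROPPROPERTIES
    "ALTER VIEW v RENAME TO v2", // TOK_ALTERVIEW_RENAME
    "ALTER VIEW v AS SELECT * FROM t" // TOK_QUERY, routed to convertCreateView
};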

Example 68 with CatalogBaseTable

use of org.apache.flink.table.catalog.CatalogBaseTable in project flink-mirror by flink-ci.

the class HiveParserDDLSemanticAnalyzer method convertDropTable.

private Operation convertDropTable(HiveParserASTNode ast, TableType expectedType) {
    String tableName = HiveParserBaseSemanticAnalyzer.getUnescapedName((HiveParserASTNode) ast.getChild(0));
    boolean ifExists = (ast.getFirstChildWithType(HiveASTParser.TOK_IFEXISTS) != null);
    ObjectIdentifier identifier = parseObjectIdentifier(tableName);
    CatalogBaseTable baseTable = getCatalogBaseTable(identifier, true);
    if (expectedType == TableType.VIRTUAL_VIEW) {
        if (baseTable instanceof CatalogTable) {
            throw new ValidationException("DROP VIEW for a table is not allowed");
        }
        return new DropViewOperation(identifier, ifExists, false);
    } else {
        if (baseTable instanceof CatalogView) {
            throw new ValidationException("DROP TABLE for a view is not allowed");
        }
        return new DropTableOperation(identifier, ifExists, false);
    }
}
Also used : CatalogBaseTable(org.apache.flink.table.catalog.CatalogBaseTable) ValidationException(org.apache.flink.table.api.ValidationException) DropViewOperation(org.apache.flink.table.operations.ddl.DropViewOperation) DropTableOperation(org.apache.flink.table.operations.ddl.DropTableOperation) CatalogTable(org.apache.flink.table.catalog.CatalogTable) CatalogView(org.apache.flink.table.catalog.CatalogView) ObjectIdentifier(org.apache.flink.table.catalog.ObjectIdentifier)
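
The guard generalizes to a simple rule: DROP VIEW must not target a table, and DROP TABLE must not target a view. A standalone sketch of the same check (the helper name is hypothetical; CatalogBaseTable, CatalogView, and ValidationException are the Flink types listed above):

// Hypothetical helper mirroring the guard in convertDropTable.
static void checkDropKind(CatalogBaseTable baseTable, boolean expectView) {
    boolean isView = baseTable instanceof CatalogView;
    if (expectView && !isView) {
        throw new ValidationException("DROP VIEW for a table is not allowed");
    }
    if (!expectView && isView) {
        throw new ValidationException("DROP TABLE for a view is not allowed");
    }
}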

Example 69 with CatalogBaseTable

use of org.apache.flink.table.catalog.CatalogBaseTable in project flink-mirror by flink-ci.

the class HiveParserDDLSemanticAnalyzer method convertAlterTable.

private Operation convertAlterTable(HiveParserASTNode input) throws SemanticException {
    Operation operation = null;
    HiveParserASTNode ast = (HiveParserASTNode) input.getChild(1);
    String[] qualified = HiveParserBaseSemanticAnalyzer.getQualifiedTableName((HiveParserASTNode) input.getChild(0));
    String tableName = HiveParserBaseSemanticAnalyzer.getDotName(qualified);
    HashMap<String, String> partSpec = null;
    HiveParserASTNode partSpecNode = (HiveParserASTNode) input.getChild(2);
    if (partSpecNode != null) {
        partSpec = getPartSpec(partSpecNode);
    }
    CatalogBaseTable alteredTable = getAlteredTable(tableName, false);
    switch(ast.getType()) {
        case HiveASTParser.TOK_ALTERTABLE_RENAME:
            operation = convertAlterTableRename(tableName, ast, false);
            break;
        case HiveASTParser.TOK_ALTERTABLE_ADDCOLS:
            operation = convertAlterTableModifyCols(alteredTable, tableName, ast, false);
            break;
        case HiveASTParser.TOK_ALTERTABLE_REPLACECOLS:
            operation = convertAlterTableModifyCols(alteredTable, tableName, ast, true);
            break;
        case HiveASTParser.TOK_ALTERTABLE_RENAMECOL:
            operation = convertAlterTableChangeCol(alteredTable, qualified, ast);
            break;
        case HiveASTParser.TOK_ALTERTABLE_ADDPARTS:
            operation = convertAlterTableAddParts(qualified, ast);
            break;
        case HiveASTParser.TOK_ALTERTABLE_DROPPARTS:
            operation = convertAlterTableDropParts(qualified, ast);
            break;
        case HiveASTParser.TOK_ALTERTABLE_PROPERTIES:
            operation = convertAlterTableProps(alteredTable, tableName, null, ast, false, false);
            break;
        case HiveASTParser.TOK_ALTERTABLE_DROPPROPERTIES:
            operation = convertAlterTableProps(alteredTable, tableName, null, ast, false, true);
            break;
        case HiveASTParser.TOK_ALTERTABLE_UPDATESTATS:
            operation = convertAlterTableProps(alteredTable, tableName, partSpec, ast, false, false);
            break;
        case HiveASTParser.TOK_ALTERTABLE_FILEFORMAT:
            operation = convertAlterTableFileFormat(alteredTable, ast, tableName, partSpec);
            break;
        case HiveASTParser.TOK_ALTERTABLE_LOCATION:
            operation = convertAlterTableLocation(alteredTable, ast, tableName, partSpec);
            break;
        case HiveASTParser.TOK_ALTERTABLE_SERIALIZER:
            operation = convertAlterTableSerde(alteredTable, ast, tableName, partSpec);
            break;
        case HiveASTParser.TOK_ALTERTABLE_SERDEPROPERTIES:
            operation = convertAlterTableSerdeProps(alteredTable, ast, tableName, partSpec);
            break;
        case HiveASTParser.TOK_ALTERTABLE_TOUCH:
        case HiveASTParser.TOK_ALTERTABLE_ARCHIVE:
        case HiveASTParser.TOK_ALTERTABLE_UNARCHIVE:
        case HiveASTParser.TOK_ALTERTABLE_PARTCOLTYPE:
        case HiveASTParser.TOK_ALTERTABLE_SKEWED:
        case HiveASTParser.TOK_ALTERTABLE_EXCHANGEPARTITION:
        case HiveASTParser.TOK_ALTERTABLE_MERGEFILES:
        case HiveASTParser.TOK_ALTERTABLE_RENAMEPART:
        case HiveASTParser.TOK_ALTERTABLE_SKEWED_LOCATION:
        case HiveASTParser.TOK_ALTERTABLE_BUCKETS:
        case HiveASTParser.TOK_ALTERTABLE_CLUSTER_SORT:
        case HiveASTParser.TOK_ALTERTABLE_COMPACT:
        case HiveASTParser.TOK_ALTERTABLE_UPDATECOLSTATS:
        case HiveASTParser.TOK_ALTERTABLE_DROPCONSTRAINT:
        case HiveASTParser.TOK_ALTERTABLE_ADDCONSTRAINT:
            handleUnsupportedOperation(ast);
            break;
        default:
            throw new ValidationException("Unknown AST node for ALTER TABLE: " + ast);
    }
    return operation;
}
Also used : CatalogBaseTable(org.apache.flink.table.catalog.CatalogBaseTable) HiveParserASTNode(org.apache.flink.table.planner.delegation.hive.copy.HiveParserASTNode) ValidationException(org.apache.flink.table.api.ValidationException) DropDatabaseOperation(org.apache.flink.table.operations.ddl.DropDatabaseOperation) AlterTableOptionsOperation(org.apache.flink.table.operations.ddl.AlterTableOptionsOperation) UseDatabaseOperation(org.apache.flink.table.operations.UseDatabaseOperation) CreateViewOperation(org.apache.flink.table.operations.ddl.CreateViewOperation) AlterDatabaseOperation(org.apache.flink.table.operations.ddl.AlterDatabaseOperation) HiveOperation(org.apache.hadoop.hive.ql.plan.HiveOperation) QueryOperation(org.apache.flink.table.operations.QueryOperation) DropCatalogFunctionOperation(org.apache.flink.table.operations.ddl.DropCatalogFunctionOperation) ShowTablesOperation(org.apache.flink.table.operations.ShowTablesOperation) DescribeTableOperation(org.apache.flink.table.operations.DescribeTableOperation) ShowFunctionsOperation(org.apache.flink.table.operations.ShowFunctionsOperation) CreateDatabaseOperation(org.apache.flink.table.operations.ddl.CreateDatabaseOperation) AlterPartitionPropertiesOperation(org.apache.flink.table.operations.ddl.AlterPartitionPropertiesOperation) ShowPartitionsOperation(org.apache.flink.table.operations.ShowPartitionsOperation) AlterViewPropertiesOperation(org.apache.flink.table.operations.ddl.AlterViewPropertiesOperation) Operation(org.apache.flink.table.operations.Operation) DropTempSystemFunctionOperation(org.apache.flink.table.operations.ddl.DropTempSystemFunctionOperation) ShowViewsOperation(org.apache.flink.table.operations.ShowViewsOperation) ShowDatabasesOperation(org.apache.flink.table.operations.ShowDatabasesOperation) AlterTableSchemaOperation(org.apache.flink.table.operations.ddl.AlterTableSchemaOperation) CreateTableASOperation(org.apache.flink.table.operations.ddl.CreateTableASOperation) DropTableOperation(org.apache.flink.table.operations.ddl.DropTableOperation) AlterViewAsOperation(org.apache.flink.table.operations.ddl.AlterViewAsOperation) CreateTableOperation(org.apache.flink.table.operations.ddl.CreateTableOperation) DropViewOperation(org.apache.flink.table.operations.ddl.DropViewOperation) AddPartitionsOperation(org.apache.flink.table.operations.ddl.AddPartitionsOperation) DropPartitionsOperation(org.apache.flink.table.operations.ddl.DropPartitionsOperation) AlterTableRenameOperation(org.apache.flink.table.operations.ddl.AlterTableRenameOperation) AlterViewRenameOperation(org.apache.flink.table.operations.ddl.AlterViewRenameOperation) CreateCatalogFunctionOperation(org.apache.flink.table.operations.ddl.CreateCatalogFunctionOperation) CreateTempSystemFunctionOperation(org.apache.flink.table.operations.ddl.CreateTempSystemFunctionOperation)
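
As with ALTER VIEW above, a few illustrative statements show which SQL reaches the supported branches; the mapping is inferred from the token names, not verified against the grammar:

// Illustrative ALTER TABLE statements per supported branch (mapping assumed):
String[] alterTableExamples = {
    "ALTER TABLE t RENAME TO t2", // TOK_ALTERTABLE_RENAME
    "ALTER TABLE t ADD COLUMNS (c INT)", // TOK_ALTERTABLE_ADDCOLS
    "ALTER TABLE t REPLACE COLUMNS (c INT)", // TOK_ALTERTABLE_REPLACECOLS
    "ALTER TABLE t CHANGE COLUMN a b STRING", // TOK_ALTERTABLE_RENAMECOL
    "ALTER TABLE t ADD PARTITION (dt='2024-01-01')", // TOK_ALTERTABLE_ADDPARTS
    "ALTER TABLE t DROP PARTITION (dt='2024-01-01')", // TOK_ALTERTABLE_DROPPARTS
    "ALTER TABLE t SET TBLPROPERTIES ('k'='v')", // TOK_ALTERTABLE_PROPERTIES
    "ALTER TABLE t SET LOCATION '/new/path'" // TOK_ALTERTABLE_LOCATION
};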

Example 70 with CatalogBaseTable

use of org.apache.flink.table.catalog.CatalogBaseTable in project flink-mirror by flink-ci.

the class HiveCatalog method alterTable.

@Override
public void alterTable(ObjectPath tablePath, CatalogBaseTable newCatalogTable, boolean ignoreIfNotExists) throws TableNotExistException, CatalogException {
    checkNotNull(tablePath, "tablePath cannot be null");
    checkNotNull(newCatalogTable, "newCatalogTable cannot be null");
    Table hiveTable;
    try {
        hiveTable = getHiveTable(tablePath);
    } catch (TableNotExistException e) {
        if (!ignoreIfNotExists) {
            throw e;
        }
        return;
    }
    CatalogBaseTable existingTable = instantiateCatalogTable(hiveTable);
    if (existingTable.getTableKind() != newCatalogTable.getTableKind()) {
        throw new CatalogException(String.format("Table types don't match. Existing table is '%s' and new table is '%s'.", existingTable.getTableKind(), newCatalogTable.getTableKind()));
    }
    disallowChangeCatalogTableType(existingTable.getOptions(), newCatalogTable.getOptions());
    boolean isHiveTable = isHiveTable(hiveTable.getParameters());
    if (isHiveTable) {
        AlterTableOp op = HiveTableUtil.extractAlterTableOp(newCatalogTable.getOptions());
        if (op == null) {
            // the alter operation isn't encoded as properties
            hiveTable = HiveTableUtil.alterTableViaCatalogBaseTable(tablePath, newCatalogTable, hiveTable, hiveConf, false);
        } else {
            alterTableViaProperties(op, hiveTable, (CatalogTable) newCatalogTable, hiveTable.getParameters(), newCatalogTable.getOptions(), hiveTable.getSd());
        }
    } else {
        hiveTable = HiveTableUtil.alterTableViaCatalogBaseTable(tablePath, newCatalogTable, hiveTable, hiveConf, ManagedTableListener.isManagedTable(this, newCatalogTable));
    }
    if (isHiveTable) {
        hiveTable.getParameters().remove(CONNECTOR.key());
    }
    try {
        client.alter_table(tablePath.getDatabaseName(), tablePath.getObjectName(), hiveTable);
    } catch (TException e) {
        throw new CatalogException(String.format("Failed to alter table %s", tablePath.getFullName()), e);
    }
}
Also used : TException(org.apache.thrift.TException) CatalogBaseTable(org.apache.flink.table.catalog.CatalogBaseTable) CatalogTable(org.apache.flink.table.catalog.CatalogTable) SqlCreateHiveTable(org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable) Table(org.apache.hadoop.hive.metastore.api.Table) AlterTableOp(org.apache.flink.sql.parser.hive.ddl.SqlAlterHiveTable.AlterTableOp) TableNotExistException(org.apache.flink.table.catalog.exceptions.TableNotExistException) CatalogException(org.apache.flink.table.catalog.exceptions.CatalogException)
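
A hedged usage sketch for the method above: updating a table's options through alterTable. CatalogTable#copy(Map) is real Flink API and the other types appear in the list above; the HiveCatalog handle, path, and option key/value are placeholders:

// Placeholder names throughout; checked exceptions are simply propagated.
static void updateTableOption(HiveCatalog hiveCatalog) throws Exception {
    ObjectPath path = new ObjectPath("default", "orders");
    CatalogTable existing = (CatalogTable) hiveCatalog.getTable(path);
    Map<String, String> options = new HashMap<>(existing.getOptions());
    options.put("k", "v"); // illustrative property key/value
    // Same table kind as before, so the kind check above passes.
    hiveCatalog.alterTable(path, existing.copy(options), false);
}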

Aggregations

CatalogBaseTable (org.apache.flink.table.catalog.CatalogBaseTable): 106
ObjectPath (org.apache.flink.table.catalog.ObjectPath): 52
CatalogTable (org.apache.flink.table.catalog.CatalogTable): 46
Test (org.junit.Test): 42
ValidationException (org.apache.flink.table.api.ValidationException): 33
ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier): 30
CatalogView (org.apache.flink.table.catalog.CatalogView): 27
TableSchema (org.apache.flink.table.api.TableSchema): 24
Table (org.apache.hadoop.hive.metastore.api.Table): 21
HashMap (java.util.HashMap): 19
SqlCreateHiveTable (org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable): 18
UniqueConstraint (org.apache.flink.table.api.constraints.UniqueConstraint): 15
ContextResolvedTable (org.apache.flink.table.catalog.ContextResolvedTable): 15
Map (java.util.Map): 13
LinkedHashMap (java.util.LinkedHashMap): 12
CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl): 12
AlterViewAsOperation (org.apache.flink.table.operations.ddl.AlterViewAsOperation): 12
DropTableOperation (org.apache.flink.table.operations.ddl.DropTableOperation): 12
ArrayList (java.util.ArrayList): 9
CatalogException (org.apache.flink.table.catalog.exceptions.CatalogException): 9