
Example 1 with CatalogDatabaseImpl

Use of org.apache.flink.table.catalog.CatalogDatabaseImpl in project flink by apache.

From the class HiveTableFactoryTest, method testGenericTable.

@Test
public void testGenericTable() throws Exception {
    final TableSchema schema = TableSchema.builder().field("name", DataTypes.STRING()).field("age", DataTypes.INT()).build();
    catalog.createDatabase("mydb", new CatalogDatabaseImpl(new HashMap<>(), ""), true);
    final Map<String, String> options = Collections.singletonMap(FactoryUtil.CONNECTOR.key(), "COLLECTION");
    final CatalogTable table = new CatalogTableImpl(schema, options, "csv table");
    catalog.createTable(new ObjectPath("mydb", "mytable"), table, true);
    final Optional<TableFactory> tableFactoryOpt = catalog.getTableFactory();
    assertTrue(tableFactoryOpt.isPresent());
    final HiveTableFactory tableFactory = (HiveTableFactory) tableFactoryOpt.get();
    final TableSource tableSource = tableFactory.createTableSource(new TableSourceFactoryContextImpl(ObjectIdentifier.of("mycatalog", "mydb", "mytable"), table, new Configuration(), false));
    assertTrue(tableSource instanceof StreamTableSource);
    final TableSink tableSink = tableFactory.createTableSink(new TableSinkFactoryContextImpl(ObjectIdentifier.of("mycatalog", "mydb", "mytable"), table, new Configuration(), true, false));
    assertTrue(tableSink instanceof StreamTableSink);
}
Also used: ObjectPath (org.apache.flink.table.catalog.ObjectPath), TableSchema (org.apache.flink.table.api.TableSchema), Configuration (org.apache.flink.configuration.Configuration), HashMap (java.util.HashMap), CatalogTable (org.apache.flink.table.catalog.CatalogTable), ResolvedCatalogTable (org.apache.flink.table.catalog.ResolvedCatalogTable), CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl), CatalogDatabaseImpl (org.apache.flink.table.catalog.CatalogDatabaseImpl), TableSink (org.apache.flink.table.sinks.TableSink), StreamTableSink (org.apache.flink.table.sinks.StreamTableSink), DynamicTableSink (org.apache.flink.table.connector.sink.DynamicTableSink), TableSource (org.apache.flink.table.sources.TableSource), StreamTableSource (org.apache.flink.table.sources.StreamTableSource), DynamicTableSource (org.apache.flink.table.connector.source.DynamicTableSource), TableFactory (org.apache.flink.table.factories.TableFactory), TableSourceFactoryContextImpl (org.apache.flink.table.factories.TableSourceFactoryContextImpl), TableSinkFactoryContextImpl (org.apache.flink.table.factories.TableSinkFactoryContextImpl), Test (org.junit.Test)
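
For reference, CatalogDatabaseImpl is a plain value object: it wraps a property map and an optional comment, exactly as the empty map and empty comment passed to createDatabase above. A minimal sketch of constructing one and reading it back (the class name, property key, and comment below are illustrative, not taken from the test):

import java.util.HashMap;
import java.util.Map;
import org.apache.flink.table.catalog.CatalogDatabase;
import org.apache.flink.table.catalog.CatalogDatabaseImpl;

public class CatalogDatabaseImplSketch {
    public static void main(String[] args) {
        // Properties are free-form key/value pairs; the comment may be empty, as in the test above.
        Map<String, String> props = new HashMap<>();
        props.put("owner", "etl-team");

        CatalogDatabase db = new CatalogDatabaseImpl(props, "example database");

        // The stored values are exposed through the CatalogDatabase interface.
        System.out.println(db.getProperties()); // {owner=etl-team}
        System.out.println(db.getComment());    // example database
    }
}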

Example 2 with CatalogDatabaseImpl

Use of org.apache.flink.table.catalog.CatalogDatabaseImpl in project flink by apache.

From the class HiveTableFactoryTest, method testHiveTable.

@Test
public void testHiveTable() throws Exception {
    final ResolvedSchema schema = ResolvedSchema.of(Column.physical("name", DataTypes.STRING()), Column.physical("age", DataTypes.INT()));
    catalog.createDatabase("mydb", new CatalogDatabaseImpl(new HashMap<>(), ""), true);
    final Map<String, String> options = Collections.singletonMap(FactoryUtil.CONNECTOR.key(), SqlCreateHiveTable.IDENTIFIER);
    final CatalogTable table = new CatalogTableImpl(TableSchema.fromResolvedSchema(schema), options, "hive table");
    catalog.createTable(new ObjectPath("mydb", "mytable"), table, true);
    final DynamicTableSource tableSource = FactoryUtil.createDynamicTableSource((DynamicTableSourceFactory) catalog.getFactory().orElseThrow(IllegalStateException::new), ObjectIdentifier.of("mycatalog", "mydb", "mytable"), new ResolvedCatalogTable(table, schema), new Configuration(), Thread.currentThread().getContextClassLoader(), false);
    assertTrue(tableSource instanceof HiveTableSource);
    final DynamicTableSink tableSink = FactoryUtil.createDynamicTableSink((DynamicTableSinkFactory) catalog.getFactory().orElseThrow(IllegalStateException::new), ObjectIdentifier.of("mycatalog", "mydb", "mytable"), new ResolvedCatalogTable(table, schema), new Configuration(), Thread.currentThread().getContextClassLoader(), false);
    assertTrue(tableSink instanceof HiveTableSink);
}
Also used: ObjectPath (org.apache.flink.table.catalog.ObjectPath), Configuration (org.apache.flink.configuration.Configuration), HashMap (java.util.HashMap), CatalogTable (org.apache.flink.table.catalog.CatalogTable), ResolvedCatalogTable (org.apache.flink.table.catalog.ResolvedCatalogTable), CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl), CatalogDatabaseImpl (org.apache.flink.table.catalog.CatalogDatabaseImpl), ResolvedSchema (org.apache.flink.table.catalog.ResolvedSchema), DynamicTableSource (org.apache.flink.table.connector.source.DynamicTableSource), DynamicTableSink (org.apache.flink.table.connector.sink.DynamicTableSink), Test (org.junit.Test)

Example 3 with CatalogDatabaseImpl

Use of org.apache.flink.table.catalog.CatalogDatabaseImpl in project flink by apache.

From the class HiveParserDDLSemanticAnalyzer, method convertAlterDatabaseOwner.

private Operation convertAlterDatabaseOwner(HiveParserASTNode ast) {
    String dbName = HiveParserBaseSemanticAnalyzer.getUnescapedName((HiveParserASTNode) ast.getChild(0));
    PrincipalDesc principalDesc = HiveParserAuthorizationParseUtils.getPrincipalDesc((HiveParserASTNode) ast.getChild(1));
    // The syntax should not allow these fields to be null, but let's verify
    String nullCmdMsg = "can't be null in alter database set owner command";
    if (principalDesc.getName() == null) {
        throw new ValidationException("Owner name " + nullCmdMsg);
    }
    if (principalDesc.getType() == null) {
        throw new ValidationException("Owner type " + nullCmdMsg);
    }
    CatalogDatabase originDB = getDatabase(dbName);
    Map<String, String> props = new HashMap<>(originDB.getProperties());
    props.put(ALTER_DATABASE_OP, SqlAlterHiveDatabase.AlterHiveDatabaseOp.CHANGE_OWNER.name());
    props.put(DATABASE_OWNER_NAME, principalDesc.getName());
    props.put(DATABASE_OWNER_TYPE, principalDesc.getType().name().toLowerCase());
    CatalogDatabase newDB = new CatalogDatabaseImpl(props, originDB.getComment());
    return new AlterDatabaseOperation(catalogManager.getCurrentCatalog(), dbName, newDB);
}
Also used: CatalogDatabase (org.apache.flink.table.catalog.CatalogDatabase), CatalogDatabaseImpl (org.apache.flink.table.catalog.CatalogDatabaseImpl), PrincipalDesc (org.apache.hadoop.hive.ql.plan.PrincipalDesc), AlterDatabaseOperation (org.apache.flink.table.operations.ddl.AlterDatabaseOperation), ValidationException (org.apache.flink.table.api.ValidationException), HashMap (java.util.HashMap), LinkedHashMap (java.util.LinkedHashMap)
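
The pattern above — copy the existing properties, add the keys that describe the change, and wrap the result in a new CatalogDatabaseImpl — is how the Hive dialect encodes ALTER DATABASE variants, since the existing CatalogDatabase is never mutated in place. A stripped-down sketch of the same pattern; the helper name and the "alter.database.op" key below are illustrative placeholders, not the real SqlAlterHiveDatabase constants:

import java.util.HashMap;
import java.util.Map;
import org.apache.flink.table.catalog.CatalogDatabase;
import org.apache.flink.table.catalog.CatalogDatabaseImpl;

public class AlterDatabaseSketch {

    /** Returns a new database definition with one added or overridden property. */
    static CatalogDatabase withProperty(CatalogDatabase origin, String key, String value) {
        Map<String, String> props = new HashMap<>(origin.getProperties());
        props.put(key, value);
        // Rebuild rather than mutate: the original CatalogDatabase is left untouched.
        return new CatalogDatabaseImpl(props, origin.getComment());
    }

    public static void main(String[] args) {
        CatalogDatabase origin = new CatalogDatabaseImpl(new HashMap<>(), "demo db");
        CatalogDatabase changed = withProperty(origin, "alter.database.op", "CHANGE_OWNER");
        System.out.println(changed.getProperties()); // {alter.database.op=CHANGE_OWNER}
    }
}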

Example 4 with CatalogDatabaseImpl

Use of org.apache.flink.table.catalog.CatalogDatabaseImpl in project flink by apache.

From the class HiveCatalog, method getDatabase.

// ------ databases ------
@Override
public CatalogDatabase getDatabase(String databaseName) throws DatabaseNotExistException, CatalogException {
    Database hiveDatabase = getHiveDatabase(databaseName);
    Map<String, String> properties = new HashMap<>(hiveDatabase.getParameters());
    properties.put(SqlCreateHiveDatabase.DATABASE_LOCATION_URI, hiveDatabase.getLocationUri());
    return new CatalogDatabaseImpl(properties, hiveDatabase.getDescription());
}
Also used: HashMap (java.util.HashMap), CatalogDatabase (org.apache.flink.table.catalog.CatalogDatabase), CatalogDatabaseImpl (org.apache.flink.table.catalog.CatalogDatabaseImpl), SqlAlterHiveDatabase (org.apache.flink.sql.parser.hive.ddl.SqlAlterHiveDatabase), SqlCreateHiveDatabase (org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveDatabase), Database (org.apache.hadoop.hive.metastore.api.Database)
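
The same create-then-read round trip through the Catalog API can be exercised without a Hive metastore by using GenericInMemoryCatalog (which also shows up in the aggregation list below). A minimal sketch, with catalog and database names made up for illustration:

import java.util.HashMap;
import java.util.Map;
import org.apache.flink.table.catalog.Catalog;
import org.apache.flink.table.catalog.CatalogDatabase;
import org.apache.flink.table.catalog.CatalogDatabaseImpl;
import org.apache.flink.table.catalog.GenericInMemoryCatalog;

public class InMemoryCatalogDatabaseSketch {
    public static void main(String[] args) throws Exception {
        Catalog catalog = new GenericInMemoryCatalog("mycatalog");
        catalog.open();

        Map<String, String> props = new HashMap<>();
        props.put("purpose", "demo");

        // Register the database, then read it back through the Catalog interface.
        catalog.createDatabase("mydb", new CatalogDatabaseImpl(props, "demo database"), false);
        CatalogDatabase fetched = catalog.getDatabase("mydb");
        System.out.println(fetched.getComment()); // demo database

        catalog.close();
    }
}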

Example 5 with CatalogDatabaseImpl

Use of org.apache.flink.table.catalog.CatalogDatabaseImpl in project flink by apache.

From the class SqlToOperationConverter, method convertCreateDatabase.

/**
 * Convert CREATE DATABASE statement.
 */
private Operation convertCreateDatabase(SqlCreateDatabase sqlCreateDatabase) {
    String[] fullDatabaseName = sqlCreateDatabase.fullDatabaseName();
    if (fullDatabaseName.length > 2) {
        throw new ValidationException("create database identifier format error");
    }
    String catalogName = (fullDatabaseName.length == 1) ? catalogManager.getCurrentCatalog() : fullDatabaseName[0];
    String databaseName = (fullDatabaseName.length == 1) ? fullDatabaseName[0] : fullDatabaseName[1];
    boolean ignoreIfExists = sqlCreateDatabase.isIfNotExists();
    String databaseComment = sqlCreateDatabase.getComment().map(comment -> comment.getNlsString().getValue()).orElse(null);
    // set with properties
    Map<String, String> properties = new HashMap<>();
    sqlCreateDatabase.getPropertyList().getList().forEach(p -> properties.put(((SqlTableOption) p).getKeyString(), ((SqlTableOption) p).getValueString()));
    CatalogDatabase catalogDatabase = new CatalogDatabaseImpl(properties, databaseComment);
    return new CreateDatabaseOperation(catalogName, databaseName, catalogDatabase, ignoreIfExists);
}
Also used: SqlCreateDatabase (org.apache.flink.sql.parser.ddl.SqlCreateDatabase), SqlTableOption (org.apache.flink.sql.parser.ddl.SqlTableOption), CatalogDatabase (org.apache.flink.table.catalog.CatalogDatabase), CatalogDatabaseImpl (org.apache.flink.table.catalog.CatalogDatabaseImpl), CatalogManager (org.apache.flink.table.catalog.CatalogManager), CreateDatabaseOperation (org.apache.flink.table.operations.ddl.CreateDatabaseOperation), Operation (org.apache.flink.table.operations.Operation), ValidationException (org.apache.flink.table.api.ValidationException), HashMap (java.util.HashMap), LinkedHashMap (java.util.LinkedHashMap), Map (java.util.Map)
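
In practice this converter is reached through the SQL API: a CREATE DATABASE statement with a comment and a WITH clause is parsed into SqlCreateDatabase and ends up as the CreateDatabaseOperation built above, carrying a CatalogDatabaseImpl. A hedged sketch of driving it from a TableEnvironment; the database name, comment, and properties are illustrative:

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class CreateDatabaseSqlSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // The COMMENT and WITH properties end up in the CatalogDatabaseImpl created by the converter.
        tEnv.executeSql(
                "CREATE DATABASE IF NOT EXISTS mydb "
                        + "COMMENT 'created from SQL' "
                        + "WITH ('owner' = 'etl-team')");

        tEnv.executeSql("SHOW DATABASES").print();
    }
}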

Aggregations

CatalogDatabaseImpl (org.apache.flink.table.catalog.CatalogDatabaseImpl): 12
HashMap (java.util.HashMap): 11
CatalogDatabase (org.apache.flink.table.catalog.CatalogDatabase): 7
AlterDatabaseOperation (org.apache.flink.table.operations.ddl.AlterDatabaseOperation): 6
LinkedHashMap (java.util.LinkedHashMap): 5
ValidationException (org.apache.flink.table.api.ValidationException): 5
CatalogTable (org.apache.flink.table.catalog.CatalogTable): 5
Catalog (org.apache.flink.table.catalog.Catalog): 4
Test (org.junit.Test): 4
TableSchema (org.apache.flink.table.api.TableSchema): 3
GenericInMemoryCatalog (org.apache.flink.table.catalog.GenericInMemoryCatalog): 3
ObjectPath (org.apache.flink.table.catalog.ObjectPath): 3
ResolvedCatalogTable (org.apache.flink.table.catalog.ResolvedCatalogTable): 3
Configuration (org.apache.flink.configuration.Configuration): 2
SqlCreateCatalog (org.apache.flink.sql.parser.ddl.SqlCreateCatalog): 2
SqlDropCatalog (org.apache.flink.sql.parser.ddl.SqlDropCatalog): 2
SqlTableOption (org.apache.flink.sql.parser.ddl.SqlTableOption): 2
SqlUseCatalog (org.apache.flink.sql.parser.ddl.SqlUseCatalog): 2
SqlShowCurrentCatalog (org.apache.flink.sql.parser.dql.SqlShowCurrentCatalog): 2
Schema (org.apache.flink.table.api.Schema): 2