
Example 36 with CatalogTableImpl

Use of org.apache.flink.table.catalog.CatalogTableImpl in project flink by apache.

The class SqlCreateTableConverter, method createCatalogTable.

private CatalogTable createCatalogTable(SqlCreateTable sqlCreateTable) {
    final TableSchema sourceTableSchema;
    final List<String> sourcePartitionKeys;
    final List<SqlTableLike.SqlTableLikeOption> likeOptions;
    final Map<String, String> sourceProperties;
    if (sqlCreateTable.getTableLike().isPresent()) {
        SqlTableLike sqlTableLike = sqlCreateTable.getTableLike().get();
        CatalogTable table = lookupLikeSourceTable(sqlTableLike);
        sourceTableSchema = TableSchema.fromResolvedSchema(table.getUnresolvedSchema().resolve(catalogManager.getSchemaResolver()));
        sourcePartitionKeys = table.getPartitionKeys();
        likeOptions = sqlTableLike.getOptions();
        sourceProperties = table.getOptions();
    } else {
        sourceTableSchema = TableSchema.builder().build();
        sourcePartitionKeys = Collections.emptyList();
        likeOptions = Collections.emptyList();
        sourceProperties = Collections.emptyMap();
    }
    Map<SqlTableLike.FeatureOption, SqlTableLike.MergingStrategy> mergingStrategies = mergeTableLikeUtil.computeMergingStrategies(likeOptions);
    Map<String, String> mergedOptions = mergeOptions(sqlCreateTable, sourceProperties, mergingStrategies);
    Optional<SqlTableConstraint> primaryKey = sqlCreateTable.getFullConstraints().stream().filter(SqlTableConstraint::isPrimaryKey).findAny();
    TableSchema mergedSchema =
            mergeTableLikeUtil.mergeTables(
                    mergingStrategies,
                    sourceTableSchema,
                    sqlCreateTable.getColumnList().getList(),
                    sqlCreateTable
                            .getWatermark()
                            .map(Collections::singletonList)
                            .orElseGet(Collections::emptyList),
                    primaryKey.orElse(null));
    List<String> partitionKeys = mergePartitions(sourcePartitionKeys, sqlCreateTable.getPartitionKeyList(), mergingStrategies);
    verifyPartitioningColumnsExist(mergedSchema, partitionKeys);
    String tableComment = sqlCreateTable.getComment().map(comment -> comment.getNlsString().getValue()).orElse(null);
    return new CatalogTableImpl(mergedSchema, partitionKeys, mergedOptions, tableComment);
}
Also used : CatalogManager(org.apache.flink.table.catalog.CatalogManager) Arrays(java.util.Arrays) ObjectIdentifier(org.apache.flink.table.catalog.ObjectIdentifier) UnresolvedIdentifier(org.apache.flink.table.catalog.UnresolvedIdentifier) SqlTableOption(org.apache.flink.sql.parser.ddl.SqlTableOption) CatalogTable(org.apache.flink.table.catalog.CatalogTable) HashMap(java.util.HashMap) Function(java.util.function.Function) SqlNode(org.apache.calcite.sql.SqlNode) Map(java.util.Map) SqlIdentifier(org.apache.calcite.sql.SqlIdentifier) SqlCreateTable(org.apache.flink.sql.parser.ddl.SqlCreateTable) ContextResolvedTable(org.apache.flink.table.catalog.ContextResolvedTable) CatalogTableImpl(org.apache.flink.table.catalog.CatalogTableImpl) Operation(org.apache.flink.table.operations.Operation) SqlTableConstraint(org.apache.flink.sql.parser.ddl.constraint.SqlTableConstraint) TableSchema(org.apache.flink.table.api.TableSchema) Collectors(java.util.stream.Collectors) Consumer(java.util.function.Consumer) SqlTableLike(org.apache.flink.sql.parser.ddl.SqlTableLike) List(java.util.List) ValidationException(org.apache.flink.table.api.ValidationException) FlinkCalciteSqlValidator(org.apache.flink.table.planner.calcite.FlinkCalciteSqlValidator) Optional(java.util.Optional) CreateTableOperation(org.apache.flink.table.operations.ddl.CreateTableOperation) SqlNodeList(org.apache.calcite.sql.SqlNodeList) Collections(java.util.Collections)
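For reference, a CatalogTableImpl can also be built directly, without the CREATE TABLE ... LIKE merging machinery above. A minimal sketch using the same four-argument constructor as the example (the schema fields, connector option, and comment are illustrative placeholders, not taken from the Flink sources; assumes the standard imports for TableSchema, DataTypes, CatalogTable, CatalogTableImpl, and java.util):

TableSchema schema = TableSchema.builder()
        .field("id", DataTypes.BIGINT())
        .field("name", DataTypes.STRING())
        .build();
Map<String, String> options = new HashMap<>();
// hypothetical connector option, for illustration only
options.put("connector", "datagen");
CatalogTable table = new CatalogTableImpl(
        schema,
        // no partition keys
        Collections.emptyList(),
        options,
        "a table created without a LIKE clause");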

Example 37 with CatalogTableImpl

Use of org.apache.flink.table.catalog.CatalogTableImpl in project flink by apache.

The class CatalogConstraintTest, method testWithoutPrimaryKey.

@Test
public void testWithoutPrimaryKey() throws Exception {
    TableSchema tableSchema =
            TableSchema.builder()
                    .fields(
                            new String[] { "a", "b", "c" },
                            new DataType[] { DataTypes.BIGINT(), DataTypes.STRING(), DataTypes.INT() })
                    .build();
    Map<String, String> properties = buildCatalogTableProperties(tableSchema);
    catalog.createTable(new ObjectPath(databaseName, "T1"), new CatalogTableImpl(tableSchema, properties, ""), false);
    RelNode t1 = TableTestUtil.toRelNode(tEnv.sqlQuery("select * from T1"));
    FlinkRelMetadataQuery mq = FlinkRelMetadataQuery.reuseOrCreate(t1.getCluster().getMetadataQuery());
    assertEquals(ImmutableSet.of(), mq.getUniqueKeys(t1));
}
Also used : ObjectPath(org.apache.flink.table.catalog.ObjectPath) TableSchema(org.apache.flink.table.api.TableSchema) RelNode(org.apache.calcite.rel.RelNode) CatalogTableImpl(org.apache.flink.table.catalog.CatalogTableImpl) DataType(org.apache.flink.table.types.DataType) FlinkRelMetadataQuery(org.apache.flink.table.planner.plan.metadata.FlinkRelMetadataQuery) Test(org.junit.Test)
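The buildCatalogTableProperties helper is not shown in this excerpt. A plausible reconstruction, assuming the legacy filesystem/CSV connector keys seen in Example 39 below and the DescriptorProperties utility (org.apache.flink.table.descriptors.DescriptorProperties and org.apache.flink.table.descriptors.Schema) for flattening the schema into string properties; the method body is an assumption, not the verbatim test code:

private Map<String, String> buildCatalogTableProperties(TableSchema tableSchema) {
    Map<String, String> properties = new HashMap<>();
    // legacy connector/format keys, mirroring Example 39 below
    properties.put("connector.type", "filesystem");
    properties.put("connector.property-version", "1");
    properties.put("connector.path", "/path/to/csv");
    properties.put("format.type", "csv");
    properties.put("format.property-version", "1");
    properties.put("format.field-delimiter", ";");
    // serialize the TableSchema into flat string properties under the "schema" prefix
    DescriptorProperties descriptorProperties = new DescriptorProperties(true);
    descriptorProperties.putTableSchema(Schema.SCHEMA, tableSchema);
    properties.putAll(descriptorProperties.asMap());
    return properties;
}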

Example 38 with CatalogTableImpl

Use of org.apache.flink.table.catalog.CatalogTableImpl in project flink by apache.

The class CatalogConstraintTest, method testWithPrimaryKey.

@Test
public void testWithPrimaryKey() throws Exception {
    TableSchema tableSchema =
            TableSchema.builder()
                    .fields(
                            new String[] { "a", "b", "c" },
                            new DataType[] { DataTypes.STRING(), DataTypes.BIGINT().notNull(), DataTypes.INT() })
                    .primaryKey("b")
                    .build();
    Map<String, String> properties = buildCatalogTableProperties(tableSchema);
    catalog.createTable(new ObjectPath(databaseName, "T1"), new CatalogTableImpl(tableSchema, properties, ""), false);
    RelNode t1 = TableTestUtil.toRelNode(tEnv.sqlQuery("select * from T1"));
    FlinkRelMetadataQuery mq = FlinkRelMetadataQuery.reuseOrCreate(t1.getCluster().getMetadataQuery());
    assertEquals(ImmutableSet.of(ImmutableBitSet.of(1)), mq.getUniqueKeys(t1));
}
Also used : ObjectPath(org.apache.flink.table.catalog.ObjectPath) TableSchema(org.apache.flink.table.api.TableSchema) RelNode(org.apache.calcite.rel.RelNode) CatalogTableImpl(org.apache.flink.table.catalog.CatalogTableImpl) FlinkRelMetadataQuery(org.apache.flink.table.planner.plan.metadata.FlinkRelMetadataQuery) Test(org.junit.Test)
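Note: getUniqueKeys reports unique keys as sets of 0-based column indices into the table schema. With columns (a, b, c), the primary key column b sits at index 1, which is why the expected result is ImmutableBitSet.of(1). The not-null constraint on b (DataTypes.BIGINT().notNull()) matters here, since primary key columns must be non-nullable.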

Example 39 with CatalogTableImpl

Use of org.apache.flink.table.catalog.CatalogTableImpl in project flink by apache.

The class CatalogStatisticsTest, method testGetStatsFromCatalogForCatalogTableImpl.

@Test
public void testGetStatsFromCatalogForCatalogTableImpl() throws Exception {
    Map<String, String> properties = new HashMap<>();
    properties.put("connector.type", "filesystem");
    properties.put("connector.property-version", "1");
    properties.put("connector.path", "/path/to/csv");
    properties.put("format.type", "csv");
    properties.put("format.property-version", "1");
    properties.put("format.field-delimiter", ";");
    catalog.createTable(new ObjectPath(databaseName, "T1"), new CatalogTableImpl(tableSchema, properties, ""), false);
    catalog.createTable(new ObjectPath(databaseName, "T2"), new CatalogTableImpl(tableSchema, properties, ""), false);
    alterTableStatistics(catalog, "T1");
    assertStatistics(tEnv, "T1");
    alterTableStatisticsWithUnknownRowCount(catalog, "T2");
    assertTableStatisticsWithUnknownRowCount(tEnv, "T2");
}
Also used : ObjectPath(org.apache.flink.table.catalog.ObjectPath) HashMap(java.util.HashMap) LinkedHashMap(java.util.LinkedHashMap) CatalogTableImpl(org.apache.flink.table.catalog.CatalogTableImpl) CatalogColumnStatisticsDataString(org.apache.flink.table.catalog.stats.CatalogColumnStatisticsDataString) Test(org.junit.Test)
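The statistics helpers are likewise not shown. A sketch of what alterTableStatistics and alterTableStatisticsWithUnknownRowCount might do with the public Catalog statistics API (assumes imports for Catalog, ObjectPath, and org.apache.flink.table.catalog.stats.CatalogTableStatistics; the concrete numbers are placeholders, and the real test also sets per-column statistics via alterTableColumnStatistics):

private void alterTableStatistics(Catalog catalog, String tableName) throws Exception {
    // rowCount=100, fileCount=10, totalSize=1000, rawDataSize=2000 -- illustrative values
    catalog.alterTableStatistics(
            new ObjectPath(databaseName, tableName),
            new CatalogTableStatistics(100L, 10, 1000L, 2000L),
            false);
}

private void alterTableStatisticsWithUnknownRowCount(Catalog catalog, String tableName) throws Exception {
    // CatalogTableStatistics.UNKNOWN reports -1 ("unknown") for all counters
    catalog.alterTableStatistics(
            new ObjectPath(databaseName, tableName),
            CatalogTableStatistics.UNKNOWN,
            false);
}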

Aggregations

CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl): 39
CatalogTable (org.apache.flink.table.catalog.CatalogTable): 26
Test (org.junit.Test): 24
TableSchema (org.apache.flink.table.api.TableSchema): 21
HashMap (java.util.HashMap): 20
ObjectPath (org.apache.flink.table.catalog.ObjectPath): 19
CatalogBaseTable (org.apache.flink.table.catalog.CatalogBaseTable): 7
Configuration (org.apache.flink.configuration.Configuration): 6
LinkedHashMap (java.util.LinkedHashMap): 5
ValidationException (org.apache.flink.table.api.ValidationException): 5
UniqueConstraint (org.apache.flink.table.api.constraints.UniqueConstraint): 5
AlterTableSchemaOperation (org.apache.flink.table.operations.ddl.AlterTableSchemaOperation): 5
TableColumn (org.apache.flink.table.api.TableColumn): 4
ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier): 4
Table (org.apache.hadoop.hive.metastore.api.Table): 4
ArrayList (java.util.ArrayList): 3
SqlCreateHiveTable (org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable): 3
ResolvedCatalogTable (org.apache.flink.table.catalog.ResolvedCatalogTable): 3
IOException (java.io.IOException): 2
Path (java.nio.file.Path): 2