
Example 76 with ObjectPath

Use of org.apache.flink.table.catalog.ObjectPath in the Apache Flink project.

From class JavaCatalogTableTest, method testResolvingSchemaOfCustomCatalogTableSql:

@Test
public void testResolvingSchemaOfCustomCatalogTableSql() throws Exception {
    TableTestUtil testUtil = getTestUtil();
    TableEnvironment tableEnvironment = testUtil.getTableEnv();
    GenericInMemoryCatalog genericInMemoryCatalog = new GenericInMemoryCatalog("in-memory");
    genericInMemoryCatalog.createTable(new ObjectPath("default", "testTable"), new CustomCatalogTable(isStreamingMode), false);
    tableEnvironment.registerCatalog("testCatalog", genericInMemoryCatalog);
    tableEnvironment.executeSql("CREATE VIEW testTable2 AS SELECT * FROM testCatalog.`default`.testTable");
    testUtil.verifyExecPlan("SELECT COUNT(*) FROM testTable2 GROUP BY TUMBLE(rowtime, INTERVAL '10' MINUTE)");
}
Also used: ObjectPath (org.apache.flink.table.catalog.ObjectPath), TableTestUtil (org.apache.flink.table.planner.utils.TableTestUtil), TableEnvironment (org.apache.flink.table.api.TableEnvironment), GenericInMemoryCatalog (org.apache.flink.table.catalog.GenericInMemoryCatalog), Test (org.junit.Test)
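
ObjectPath itself is just a (database name, object name) pair used to address objects inside a catalog. A minimal sketch of constructing and parsing one (the names below are illustrative, not taken from the test):

ObjectPath path = new ObjectPath("default", "testTable");
// getDatabaseName()/getObjectName() return the two components
String db = path.getDatabaseName();
// getFullName() joins them with a dot: "default.testTable"
String fullName = path.getFullName();
// fromString() parses a "db.object" string back into an ObjectPath
ObjectPath parsed = ObjectPath.fromString("default.testTable");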

Example 77 with ObjectPath

Use of org.apache.flink.table.catalog.ObjectPath in the Apache Flink project.

From class CatalogConstraintTest, method testWithoutPrimaryKey:

@Test
public void testWithoutPrimaryKey() throws Exception {
    TableSchema tableSchema = TableSchema.builder().fields(new String[] { "a", "b", "c" }, new DataType[] { DataTypes.BIGINT(), DataTypes.STRING(), DataTypes.INT() }).build();
    Map<String, String> properties = buildCatalogTableProperties(tableSchema);
    catalog.createTable(new ObjectPath(databaseName, "T1"), new CatalogTableImpl(tableSchema, properties, ""), false);
    RelNode t1 = TableTestUtil.toRelNode(tEnv.sqlQuery("select * from T1"));
    FlinkRelMetadataQuery mq = FlinkRelMetadataQuery.reuseOrCreate(t1.getCluster().getMetadataQuery());
    assertEquals(ImmutableSet.of(), mq.getUniqueKeys(t1));
}
Also used: ObjectPath (org.apache.flink.table.catalog.ObjectPath), TableSchema (org.apache.flink.table.api.TableSchema), RelNode (org.apache.calcite.rel.RelNode), CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl), DataType (org.apache.flink.table.types.DataType), FlinkRelMetadataQuery (org.apache.flink.table.planner.plan.metadata.FlinkRelMetadataQuery), Test (org.junit.Test)

Example 78 with ObjectPath

Use of org.apache.flink.table.catalog.ObjectPath in the Apache Flink project.

From class CatalogConstraintTest, method testWithPrimaryKey:

@Test
public void testWithPrimaryKey() throws Exception {
    TableSchema tableSchema = TableSchema.builder().fields(new String[] { "a", "b", "c" }, new DataType[] { DataTypes.STRING(), DataTypes.BIGINT().notNull(), DataTypes.INT() }).primaryKey("b").build();
    Map<String, String> properties = buildCatalogTableProperties(tableSchema);
    catalog.createTable(new ObjectPath(databaseName, "T1"), new CatalogTableImpl(tableSchema, properties, ""), false);
    RelNode t1 = TableTestUtil.toRelNode(tEnv.sqlQuery("select * from T1"));
    FlinkRelMetadataQuery mq = FlinkRelMetadataQuery.reuseOrCreate(t1.getCluster().getMetadataQuery());
    assertEquals(ImmutableSet.of(ImmutableBitSet.of(1)), mq.getUniqueKeys(t1));
}
Also used: ObjectPath (org.apache.flink.table.catalog.ObjectPath), TableSchema (org.apache.flink.table.api.TableSchema), RelNode (org.apache.calcite.rel.RelNode), CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl), FlinkRelMetadataQuery (org.apache.flink.table.planner.plan.metadata.FlinkRelMetadataQuery), Test (org.junit.Test)
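
Both constraint tests register the table under new ObjectPath(databaseName, "T1") and then check mq.getUniqueKeys(t1): with the primary key on b the result is ImmutableBitSet.of(1) (the zero-based index of b in {a, b, c}), and without a primary key it is the empty set. As a complementary sketch, the same ObjectPath can be used to read the table back from the catalog and inspect the declared primary key (assumes the catalog and databaseName fields from the tests; handling of the checked TableNotExistException is omitted):

ObjectPath tablePath = new ObjectPath(databaseName, "T1");
if (catalog.tableExists(tablePath)) {
    // getTable(...) declares TableNotExistException, guarded here by tableExists(...)
    CatalogBaseTable table = catalog.getTable(tablePath);
    TableSchema schema = table.getSchema();
    // Optional<UniqueConstraint> describing the declared primary key, if any
    schema.getPrimaryKey().ifPresent(pk -> System.out.println(pk.getColumns()));
}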

Example 79 with ObjectPath

Use of org.apache.flink.table.catalog.ObjectPath in the Apache Flink project.

From class CatalogStatisticsTest, method testGetStatsFromCatalogForCatalogTableImpl:

@Test
public void testGetStatsFromCatalogForCatalogTableImpl() throws Exception {
    Map<String, String> properties = new HashMap<>();
    properties.put("connector.type", "filesystem");
    properties.put("connector.property-version", "1");
    properties.put("connector.path", "/path/to/csv");
    properties.put("format.type", "csv");
    properties.put("format.property-version", "1");
    properties.put("format.field-delimiter", ";");
    catalog.createTable(new ObjectPath(databaseName, "T1"), new CatalogTableImpl(tableSchema, properties, ""), false);
    catalog.createTable(new ObjectPath(databaseName, "T2"), new CatalogTableImpl(tableSchema, properties, ""), false);
    alterTableStatistics(catalog, "T1");
    assertStatistics(tEnv, "T1");
    alterTableStatisticsWithUnknownRowCount(catalog, "T2");
    assertTableStatisticsWithUnknownRowCount(tEnv, "T2");
}
Also used: ObjectPath (org.apache.flink.table.catalog.ObjectPath), HashMap (java.util.HashMap), LinkedHashMap (java.util.LinkedHashMap), CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl), CatalogColumnStatisticsDataString (org.apache.flink.table.catalog.stats.CatalogColumnStatisticsDataString), Test (org.junit.Test)
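
The test writes statistics through alterTableStatistics and alterTableStatisticsWithUnknownRowCount and lets the planner assertions read them back. For completeness, the read side of the Catalog API is keyed by the same ObjectPath; a short sketch (assumes the catalog and databaseName fields from the test, checked exceptions omitted):

ObjectPath t1 = new ObjectPath(databaseName, "T1");
// table-level statistics: row count, file count, total size, raw data size
CatalogTableStatistics tableStats = catalog.getTableStatistics(t1);
long rowCount = tableStats.getRowCount();
// per-column statistics keyed by column name
CatalogColumnStatistics columnStats = catalog.getTableColumnStatistics(t1);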

Example 80 with ObjectPath

Use of org.apache.flink.table.catalog.ObjectPath in the Apache Flink project.

From class CatalogStatisticsTest, the alterTableStatistics helper used in Example 79:

private void alterTableStatistics(Catalog catalog, String tableName) throws TableNotExistException, TablePartitionedException {
    catalog.alterTableStatistics(new ObjectPath(databaseName, tableName), new CatalogTableStatistics(100, 10, 1000L, 2000L), true);
    catalog.alterTableColumnStatistics(new ObjectPath(databaseName, tableName), createColumnStats(), true);
}
Also used: ObjectPath (org.apache.flink.table.catalog.ObjectPath), CatalogTableStatistics (org.apache.flink.table.catalog.stats.CatalogTableStatistics)
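
createColumnStats() is a helper of the test class that is not shown in this excerpt. Purely as an illustration of what such a helper could build (a hypothetical sketch, not the test's actual implementation; the column names are placeholders):

// Hypothetical sketch only, not the test's createColumnStats() implementation.
Map<String, CatalogColumnStatisticsDataBase> colStats = new HashMap<>();
// numeric column: min, max, number of distinct values, null count
colStats.put("b", new CatalogColumnStatisticsDataLong(0L, 100L, 80L, 0L));
// string column: max length, average length, number of distinct values, null count
colStats.put("c", new CatalogColumnStatisticsDataString(50L, 10.5, 90L, 5L));
CatalogColumnStatistics columnStatistics = new CatalogColumnStatistics(colStats);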

Aggregations

ObjectPath (org.apache.flink.table.catalog.ObjectPath): 81
Test (org.junit.Test): 52
CatalogBaseTable (org.apache.flink.table.catalog.CatalogBaseTable): 32
CatalogTable (org.apache.flink.table.catalog.CatalogTable): 29
HashMap (java.util.HashMap): 21
CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl): 20
TableSchema (org.apache.flink.table.api.TableSchema): 19
TableEnvironment (org.apache.flink.table.api.TableEnvironment): 17
CatalogPartitionSpec (org.apache.flink.table.catalog.CatalogPartitionSpec): 12
Table (org.apache.hadoop.hive.metastore.api.Table): 12
Configuration (org.apache.flink.configuration.Configuration): 11
SqlCreateHiveTable (org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable): 11
TableNotExistException (org.apache.flink.table.catalog.exceptions.TableNotExistException): 9
ArrayList (java.util.ArrayList): 8
Map (java.util.Map): 8
GenericInMemoryCatalog (org.apache.flink.table.catalog.GenericInMemoryCatalog): 8
LinkedHashMap (java.util.LinkedHashMap): 7
Catalog (org.apache.flink.table.catalog.Catalog): 7
ContextResolvedTable (org.apache.flink.table.catalog.ContextResolvedTable): 6
ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier): 6