
Example 6 with CatalogTableImpl

Use of org.apache.flink.table.catalog.CatalogTableImpl in the Apache Flink project.

From class HiveCatalogTest, method testCreateGenericTable.

@Test
public void testCreateGenericTable() {
    // A generic (non-Hive) Flink table is written to the metastore with every
    // property namespaced under the Flink property prefix.
    Table hiveTable =
            HiveTableUtil.instantiateHiveTable(
                    new ObjectPath("test", "test"),
                    new CatalogTableImpl(
                            schema, getLegacyFileSystemConnectorOptions("/test_path"), null),
                    HiveTestUtils.createHiveConf(),
                    false);
    Map<String, String> prop = hiveTable.getParameters();
    assertThat(HiveCatalog.isHiveTable(prop)).isFalse();
    assertThat(prop.keySet())
            .allMatch(k -> k.startsWith(CatalogPropertiesUtil.FLINK_PROPERTY_PREFIX));
}
Also used: ObjectPath (org.apache.flink.table.catalog.ObjectPath), CatalogTable (org.apache.flink.table.catalog.CatalogTable), SqlCreateHiveTable (org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable), CatalogBaseTable (org.apache.flink.table.catalog.CatalogBaseTable), Table (org.apache.hadoop.hive.metastore.api.Table), CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl), Test (org.junit.Test)
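The prefix assertion above can be sketched in isolation. Note this is a minimal, self-contained sketch: the constant value `"flink."` for `CatalogPropertiesUtil.FLINK_PROPERTY_PREFIX` is an assumption here, and `FlinkPrefixCheck`/`allFlinkPrefixed` are hypothetical names, not Flink API.

```java
import java.util.Map;

public class FlinkPrefixCheck {
    // Assumed value of CatalogPropertiesUtil.FLINK_PROPERTY_PREFIX; verify
    // against the Flink source before relying on it.
    static final String FLINK_PROPERTY_PREFIX = "flink.";

    // Returns true when every metastore parameter key carries the Flink prefix,
    // which is the property shape the test above asserts for a generic table.
    static boolean allFlinkPrefixed(Map<String, String> params) {
        return params.keySet().stream().allMatch(k -> k.startsWith(FLINK_PROPERTY_PREFIX));
    }

    public static void main(String[] args) {
        Map<String, String> generic =
                Map.of("flink.connector", "filesystem", "flink.path", "/test_path");
        Map<String, String> hiveLike = Map.of("transient_lastDdlTime", "0");
        System.out.println(allFlinkPrefixed(generic));  // true
        System.out.println(allFlinkPrefixed(hiveLike)); // false
    }
}
```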

Example 7 with CatalogTableImpl

Use of org.apache.flink.table.catalog.CatalogTableImpl in the Apache Flink project.

From class HiveCatalogTest, method testAlterFlinkNonManagedTableToFlinkManagedTable.

@Test
public void testAlterFlinkNonManagedTableToFlinkManagedTable() throws Exception {
    // Create a non-managed table (explicit datagen connector), then try to alter
    // it into a managed table (no connector option); the catalog must reject it.
    Map<String, String> originOptions =
            Collections.singletonMap(
                    FactoryUtil.CONNECTOR.key(), DataGenTableSourceFactory.IDENTIFIER);
    CatalogTable originTable =
            new CatalogTableImpl(schema, originOptions, "Flink non-managed table");
    hiveCatalog.createTable(tablePath, originTable, false);

    Map<String, String> newOptions = Collections.emptyMap();
    CatalogTable newTable = new CatalogTableImpl(schema, newOptions, "Flink managed table");
    assertThatThrownBy(() -> hiveCatalog.alterTable(tablePath, newTable, false))
            .isInstanceOf(IllegalArgumentException.class)
            .hasMessageContaining(
                    "Changing catalog table type is not allowed. "
                            + "Existing table type is 'FLINK_NON_MANAGED_TABLE', "
                            + "but new table type is 'FLINK_MANAGED_TABLE'");
}
Also used: CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl), CatalogTable (org.apache.flink.table.catalog.CatalogTable), Test (org.junit.Test)
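The guard this test exercises can be sketched without a Hive metastore. Everything below is a hypothetical simplification, not HiveCatalog's actual code: the `TableKind` enum names only mirror the strings in the expected error message, and `classify` uses a toy rule (no connector option means managed) where the real catalog inspects more markers.

```java
import java.util.Map;

public class TableTypeGuard {
    // Simplified taxonomy; names mirror those in the expected error message.
    enum TableKind { HIVE_TABLE, FLINK_MANAGED_TABLE, FLINK_NON_MANAGED_TABLE }

    // Toy classification rule for illustration only.
    static TableKind classify(Map<String, String> options) {
        String connector = options.get("connector");
        if (connector == null) return TableKind.FLINK_MANAGED_TABLE;
        if ("hive".equals(connector)) return TableKind.HIVE_TABLE;
        return TableKind.FLINK_NON_MANAGED_TABLE;
    }

    // Rejects an alterTable call that would change the table's kind, producing
    // a message shaped like the one the test asserts on.
    static void checkAlter(Map<String, String> existing, Map<String, String> updated) {
        TableKind oldKind = classify(existing);
        TableKind newKind = classify(updated);
        if (oldKind != newKind) {
            throw new IllegalArgumentException(
                    "Changing catalog table type is not allowed. Existing table type is '"
                            + oldKind + "', but new table type is '" + newKind + "'");
        }
    }

    public static void main(String[] args) {
        try {
            checkAlter(Map.of("connector", "datagen"), Map.of());
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```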

Example 8 with CatalogTableImpl

Use of org.apache.flink.table.catalog.CatalogTableImpl in the Apache Flink project.

From class HiveCatalogTest, method testAlterFlinkManagedTableToFlinkManagedTable.

@Test
public void testAlterFlinkManagedTableToFlinkManagedTable() throws Exception {
    // The reverse direction: a managed table (no connector option) may not be
    // altered into a non-managed one (explicit datagen connector).
    Map<String, String> originOptions = Collections.emptyMap();
    CatalogTable originTable = new CatalogTableImpl(schema, originOptions, "Flink managed table");
    hiveCatalog.createTable(tablePath, originTable, false);

    Map<String, String> newOptions =
            Collections.singletonMap(
                    FactoryUtil.CONNECTOR.key(), DataGenTableSourceFactory.IDENTIFIER);
    CatalogTable newTable =
            new CatalogTableImpl(schema, newOptions, "Flink non-managed table");
    assertThatThrownBy(() -> hiveCatalog.alterTable(tablePath, newTable, false))
            .isInstanceOf(IllegalArgumentException.class)
            .hasMessageContaining(
                    "Changing catalog table type is not allowed. "
                            + "Existing table type is 'FLINK_MANAGED_TABLE', "
                            + "but new table type is 'FLINK_NON_MANAGED_TABLE'");
}
Also used: CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl), CatalogTable (org.apache.flink.table.catalog.CatalogTable), Test (org.junit.Test)

Example 9 with CatalogTableImpl

Use of org.apache.flink.table.catalog.CatalogTableImpl in the Apache Flink project.

From class HiveCatalogTest, method testCreateAndGetFlinkManagedTable.

@Test
public void testCreateAndGetFlinkManagedTable() throws Exception {
    // A table created with no connector option is treated as a Flink managed
    // table: the metastore entry carries a prefixed default-connector marker,
    // and that marker is stripped again when the table is read back.
    CatalogTable table =
            new CatalogTableImpl(schema, Collections.emptyMap(), "Flink managed table");
    hiveCatalog.createTable(tablePath, table, false);

    Table hiveTable = hiveCatalog.getHiveTable(tablePath);
    assertThat(hiveTable.getParameters())
            .containsEntry(
                    FLINK_PROPERTY_PREFIX + CONNECTOR.key(),
                    ManagedTableFactory.DEFAULT_IDENTIFIER);

    CatalogBaseTable retrievedTable = hiveCatalog.instantiateCatalogTable(hiveTable);
    assertThat(retrievedTable.getOptions()).isEmpty();
}
Also used: CatalogBaseTable (org.apache.flink.table.catalog.CatalogBaseTable), CatalogTable (org.apache.flink.table.catalog.CatalogTable), SqlCreateHiveTable (org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable), Table (org.apache.hadoop.hive.metastore.api.Table), CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl), Test (org.junit.Test)
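The persist/retrieve round trip this test asserts can be sketched with plain maps. This is a toy model, not Flink code: the constant values `"flink."`, `"connector"`, and `"default"` are assumptions standing in for `CatalogPropertiesUtil.FLINK_PROPERTY_PREFIX`, `FactoryUtil.CONNECTOR.key()`, and `ManagedTableFactory.DEFAULT_IDENTIFIER`, and the class and method names are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

public class ManagedTableRoundTrip {
    // Assumed constant values; verify against the Flink source.
    static final String FLINK_PROPERTY_PREFIX = "flink.";
    static final String CONNECTOR_KEY = "connector";
    static final String DEFAULT_IDENTIFIER = "default";

    // On create: options are prefixed, and a managed table (no connector)
    // additionally gets an explicit default-connector marker in the metastore.
    static Map<String, String> persist(Map<String, String> options) {
        Map<String, String> params = new HashMap<>();
        options.forEach((k, v) -> params.put(FLINK_PROPERTY_PREFIX + k, v));
        params.putIfAbsent(FLINK_PROPERTY_PREFIX + CONNECTOR_KEY, DEFAULT_IDENTIFIER);
        return params;
    }

    // On read: prefixes are removed and the default marker is stripped, so the
    // user-visible options of a managed table come back empty.
    static Map<String, String> retrieve(Map<String, String> params) {
        Map<String, String> options = new HashMap<>();
        params.forEach((k, v) -> {
            if (k.startsWith(FLINK_PROPERTY_PREFIX)) {
                options.put(k.substring(FLINK_PROPERTY_PREFIX.length()), v);
            }
        });
        if (DEFAULT_IDENTIFIER.equals(options.get(CONNECTOR_KEY))) {
            options.remove(CONNECTOR_KEY);
        }
        return options;
    }

    public static void main(String[] args) {
        Map<String, String> params = persist(Map.of());
        System.out.println(params.get("flink.connector")); // default
        System.out.println(retrieve(params).isEmpty());    // true
    }
}
```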

Example 10 with CatalogTableImpl

Use of org.apache.flink.table.catalog.CatalogTableImpl in the Apache Flink project.

From class HiveCatalogTest, method testAlterFlinkManagedTableToHiveTable.

@Test
public void testAlterFlinkManagedTableToHiveTable() throws Exception {
    // Create a managed table, then try to alter it into a Hive table; the
    // catalog rejects this type change as well.
    Map<String, String> originOptions = Collections.emptyMap();
    CatalogTable originTable = new CatalogTableImpl(schema, originOptions, "Flink managed table");
    hiveCatalog.createTable(tablePath, originTable, false);

    Map<String, String> newOptions = getLegacyFileSystemConnectorOptions("/test_path");
    newOptions.put(FactoryUtil.CONNECTOR.key(), SqlCreateHiveTable.IDENTIFIER);
    CatalogTable newTable = new CatalogTableImpl(schema, newOptions, "Hive table");
    assertThatThrownBy(() -> hiveCatalog.alterTable(tablePath, newTable, false))
            .isInstanceOf(IllegalArgumentException.class)
            .hasMessageContaining(
                    "Changing catalog table type is not allowed. "
                            + "Existing table type is 'FLINK_MANAGED_TABLE', "
                            + "but new table type is 'HIVE_TABLE'");
}
Also used: CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl), CatalogTable (org.apache.flink.table.catalog.CatalogTable), Test (org.junit.Test)

Aggregations

CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl): 39 uses
CatalogTable (org.apache.flink.table.catalog.CatalogTable): 26 uses
Test (org.junit.Test): 24 uses
TableSchema (org.apache.flink.table.api.TableSchema): 21 uses
HashMap (java.util.HashMap): 20 uses
ObjectPath (org.apache.flink.table.catalog.ObjectPath): 19 uses
CatalogBaseTable (org.apache.flink.table.catalog.CatalogBaseTable): 7 uses
Configuration (org.apache.flink.configuration.Configuration): 6 uses
LinkedHashMap (java.util.LinkedHashMap): 5 uses
ValidationException (org.apache.flink.table.api.ValidationException): 5 uses
UniqueConstraint (org.apache.flink.table.api.constraints.UniqueConstraint): 5 uses
AlterTableSchemaOperation (org.apache.flink.table.operations.ddl.AlterTableSchemaOperation): 5 uses
TableColumn (org.apache.flink.table.api.TableColumn): 4 uses
ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier): 4 uses
Table (org.apache.hadoop.hive.metastore.api.Table): 4 uses
ArrayList (java.util.ArrayList): 3 uses
SqlCreateHiveTable (org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable): 3 uses
ResolvedCatalogTable (org.apache.flink.table.catalog.ResolvedCatalogTable): 3 uses
IOException (java.io.IOException): 2 uses
Path (java.nio.file.Path): 2 uses