Example 6 with CatalogFunction

Use of org.apache.flink.table.catalog.CatalogFunction in project flink by apache.

From the class HiveCatalog, method alterFunction:

@Override
public void alterFunction(ObjectPath functionPath, CatalogFunction newFunction, boolean ignoreIfNotExists) throws FunctionNotExistException, CatalogException {
    checkNotNull(functionPath, "functionPath cannot be null");
    checkNotNull(newFunction, "newFunction cannot be null");
    try {
        // check if function exists
        getFunction(functionPath);
        Function hiveFunction;
        if (newFunction instanceof CatalogFunctionImpl) {
            hiveFunction = instantiateHiveFunction(functionPath, newFunction);
        } else {
            throw new CatalogException(String.format("Unsupported catalog function type %s", newFunction.getClass().getName()));
        }
        client.alterFunction(functionPath.getDatabaseName(), functionPath.getObjectName(), hiveFunction);
    } catch (FunctionNotExistException e) {
        if (!ignoreIfNotExists) {
            throw e;
        }
    } catch (TException e) {
        throw new CatalogException(String.format("Failed to alter function %s", functionPath.getFullName()), e);
    }
}
Also used : FunctionNotExistException(org.apache.flink.table.catalog.exceptions.FunctionNotExistException) TException(org.apache.thrift.TException) CatalogFunction(org.apache.flink.table.catalog.CatalogFunction) Function(org.apache.hadoop.hive.metastore.api.Function) CatalogException(org.apache.flink.table.catalog.exceptions.CatalogException) CatalogFunctionImpl(org.apache.flink.table.catalog.CatalogFunctionImpl)
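The control flow above follows a common catalog pattern: probe for existence first, translate the Flink-side object, and rethrow the not-found error only when `ignoreIfNotExists` is false. A minimal stand-alone sketch of that pattern, using a plain map instead of the Hive metastore client (all names here are hypothetical stand-ins, not Flink API):

```java
import java.util.HashMap;
import java.util.Map;

public class AlterPatternSketch {
    // Hypothetical stand-ins for FunctionNotExistException and the metastore.
    static class NotExistException extends RuntimeException {}
    static final Map<String, String> store = new HashMap<>();

    /** Mirrors HiveCatalog.alterFunction: look up first, then replace. */
    static void alter(String path, String newClassName, boolean ignoreIfNotExists) {
        try {
            if (!store.containsKey(path)) {
                throw new NotExistException(); // "check if function exists"
            }
            store.put(path, newClassName);     // client.alterFunction(...)
        } catch (NotExistException e) {
            if (!ignoreIfNotExists) {
                throw e;                       // surface only when the caller asked for it
            }
        }
    }

    public static void main(String[] args) {
        store.put("db1.f", "com.example.A");
        alter("db1.f", "com.example.B", false);
        System.out.println(store.get("db1.f"));               // com.example.B
        alter("db1.missing", "com.example.C", true);          // silently ignored
        System.out.println(store.containsKey("db1.missing")); // false
    }
}
```

The same swallow-or-rethrow shape appears throughout the `Catalog` implementations, which is why `ignoreIfNotExists` shows up in almost every alter/drop signature.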

Example 7 with CatalogFunction

Use of org.apache.flink.table.catalog.CatalogFunction in project flink by apache.

From the class HiveParserDDLSemanticAnalyzer, method convertCreateFunction:

private Operation convertCreateFunction(HiveParserASTNode ast) {
    // ^(TOK_CREATEFUNCTION identifier StringLiteral ({isTempFunction}? => TOK_TEMPORARY))
    String functionName = ast.getChild(0).getText().toLowerCase();
    boolean isTemporaryFunction = (ast.getFirstChildWithType(HiveASTParser.TOK_TEMPORARY) != null);
    String className = HiveParserBaseSemanticAnalyzer.unescapeSQLString(ast.getChild(1).getText());
    // Temp functions are not allowed to have qualified names.
    if (isTemporaryFunction && FunctionUtils.isQualifiedFunctionName(functionName)) {
        // belong to a catalog/db
        throw new ValidationException("Temporary function cannot be created with a qualified name.");
    }
    if (isTemporaryFunction) {
        FunctionDefinition funcDefinition = funcDefFactory.createFunctionDefinition(functionName, new CatalogFunctionImpl(className, FunctionLanguage.JAVA));
        return new CreateTempSystemFunctionOperation(functionName, false, funcDefinition);
    } else {
        ObjectIdentifier identifier = parseObjectIdentifier(functionName);
        CatalogFunction catalogFunction = new CatalogFunctionImpl(className, FunctionLanguage.JAVA);
        return new CreateCatalogFunctionOperation(identifier, catalogFunction, false, false);
    }
}
Also used : CreateCatalogFunctionOperation(org.apache.flink.table.operations.ddl.CreateCatalogFunctionOperation) ValidationException(org.apache.flink.table.api.ValidationException) FunctionDefinition(org.apache.flink.table.functions.FunctionDefinition) CatalogFunction(org.apache.flink.table.catalog.CatalogFunction) CreateTempSystemFunctionOperation(org.apache.flink.table.operations.ddl.CreateTempSystemFunctionOperation) CatalogFunctionImpl(org.apache.flink.table.catalog.CatalogFunctionImpl) ObjectIdentifier(org.apache.flink.table.catalog.ObjectIdentifier)
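The branch above hinges on whether the function name is qualified: a temporary function must use a simple name, while a catalog function may carry a database prefix. A rough stand-alone equivalent of that validation (the real check lives in Hive's FunctionUtils; the dot test below is a simplifying assumption):

```java
public class TempFunctionNameCheck {
    /** Rough equivalent of FunctionUtils.isQualifiedFunctionName:
     *  a qualified name carries a "db." prefix. (Simplified assumption.) */
    static boolean isQualified(String functionName) {
        return functionName.contains(".");
    }

    /** Mirrors the validation in convertCreateFunction. */
    static void validate(String functionName, boolean isTemporary) {
        if (isTemporary && isQualified(functionName)) {
            throw new IllegalArgumentException(
                "Temporary function cannot be created with a qualified name.");
        }
    }

    public static void main(String[] args) {
        validate("myfunc", true);       // ok: temporary function, simple name
        validate("db1.myfunc", false);  // ok: catalog function may be qualified
        try {
            validate("db1.myfunc", true);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```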

Example 8 with CatalogFunction

Use of org.apache.flink.table.catalog.CatalogFunction in project flink by apache.

From the class HiveCatalogGenericMetadataTest, method testFunctionWithNonExistClass:

// ------ functions ------
@Test
public void testFunctionWithNonExistClass() throws Exception {
    // to make sure hive catalog doesn't check function class
    catalog.createDatabase(db1, createDb(), false);
    CatalogFunction catalogFunction = new CatalogFunctionImpl("non.exist.scala.class", FunctionLanguage.SCALA);
    catalog.createFunction(path1, catalogFunction, false);
    assertEquals(catalogFunction.getClassName(), catalog.getFunction(path1).getClassName());
    assertEquals(catalogFunction.getFunctionLanguage(), catalog.getFunction(path1).getFunctionLanguage());
    // alter the function
    catalogFunction = new CatalogFunctionImpl("non.exist.java.class", FunctionLanguage.JAVA);
    catalog.alterFunction(path1, catalogFunction, false);
    assertEquals(catalogFunction.getClassName(), catalog.getFunction(path1).getClassName());
    assertEquals(catalogFunction.getFunctionLanguage(), catalog.getFunction(path1).getFunctionLanguage());
    catalogFunction = new CatalogFunctionImpl("non.exist.python.class", FunctionLanguage.PYTHON);
    catalog.alterFunction(path1, catalogFunction, false);
    assertEquals(catalogFunction.getClassName(), catalog.getFunction(path1).getClassName());
    assertEquals(catalogFunction.getFunctionLanguage(), catalog.getFunction(path1).getFunctionLanguage());
}
Also used : CatalogFunction(org.apache.flink.table.catalog.CatalogFunction) CatalogFunctionImpl(org.apache.flink.table.catalog.CatalogFunctionImpl) Test(org.junit.Test)
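The test above works precisely because the catalog stores only the class-name string and never loads the class at create/alter time; resolution is deferred until the function is actually used. A minimal plain-Java illustration of that deferred-resolution behavior (the registry here is a hypothetical stand-in, not the HiveCatalog):

```java
public class LazyClassResolution {
    // Registering a function stores only the class-name string; nothing is loaded.
    static String registeredClassName;

    static void register(String className) {
        registeredClassName = className; // no Class.forName here
    }

    /** Resolution happens only when the function is actually invoked. */
    static Class<?> resolve() throws ClassNotFoundException {
        return Class.forName(registeredClassName);
    }

    public static void main(String[] args) {
        register("non.exist.scala.class"); // succeeds, as in the test above
        try {
            resolve();                      // only now does the missing class surface
        } catch (ClassNotFoundException e) {
            System.out.println("deferred failure: " + e.getMessage());
        }
    }
}
```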

Example 9 with CatalogFunction

Use of org.apache.flink.table.catalog.CatalogFunction in project flink by apache.

From the class SqlToOperationConverterTest, method testCreateTableWithWatermark:

@Test
public void testCreateTableWithWatermark() throws FunctionAlreadyExistException, DatabaseNotExistException {
    CatalogFunction cf = new CatalogFunctionImpl(JavaUserDefinedScalarFunctions.JavaFunc5.class.getName());
    catalog.createFunction(ObjectPath.fromString("default.myfunc"), cf, true);
    final String sql = "create table source_table(\n" + "  a int,\n" + "  b bigint,\n" + "  c timestamp(3),\n" + "  watermark for `c` as myfunc(c, 1) - interval '5' second\n" + ") with (\n" + "  'connector.type' = 'kafka')\n";
    final FlinkPlannerImpl planner = getPlannerBySqlDialect(SqlDialect.DEFAULT);
    final CalciteParser parser = getParserBySqlDialect(SqlDialect.DEFAULT);
    SqlNode node = parser.parse(sql);
    assertThat(node).isInstanceOf(SqlCreateTable.class);
    Operation operation = SqlToOperationConverter.convert(planner, catalogManager, node).get();
    assertThat(operation).isInstanceOf(CreateTableOperation.class);
    CreateTableOperation op = (CreateTableOperation) operation;
    CatalogTable catalogTable = op.getCatalogTable();
    Map<String, String> properties = catalogTable.toProperties();
    Map<String, String> expected = new HashMap<>();
    expected.put("schema.0.name", "a");
    expected.put("schema.0.data-type", "INT");
    expected.put("schema.1.name", "b");
    expected.put("schema.1.data-type", "BIGINT");
    expected.put("schema.2.name", "c");
    expected.put("schema.2.data-type", "TIMESTAMP(3)");
    expected.put("schema.watermark.0.rowtime", "c");
    expected.put("schema.watermark.0.strategy.expr", "`builtin`.`default`.`myfunc`(`c`, 1) - INTERVAL '5' SECOND");
    expected.put("schema.watermark.0.strategy.data-type", "TIMESTAMP(3)");
    expected.put("connector.type", "kafka");
    assertThat(properties).isEqualTo(expected);
}
Also used : HashMap(java.util.HashMap) FlinkPlannerImpl(org.apache.flink.table.planner.calcite.FlinkPlannerImpl) CatalogFunction(org.apache.flink.table.catalog.CatalogFunction) OperationMatchers.isCreateTableOperation(org.apache.flink.table.planner.utils.OperationMatchers.isCreateTableOperation) DropDatabaseOperation(org.apache.flink.table.operations.ddl.DropDatabaseOperation) SinkModifyOperation(org.apache.flink.table.operations.SinkModifyOperation) AlterTableOptionsOperation(org.apache.flink.table.operations.ddl.AlterTableOptionsOperation) AlterTableDropConstraintOperation(org.apache.flink.table.operations.ddl.AlterTableDropConstraintOperation) UseCatalogOperation(org.apache.flink.table.operations.UseCatalogOperation) UseDatabaseOperation(org.apache.flink.table.operations.UseDatabaseOperation) CreateViewOperation(org.apache.flink.table.operations.ddl.CreateViewOperation) ShowJarsOperation(org.apache.flink.table.operations.command.ShowJarsOperation) AlterDatabaseOperation(org.apache.flink.table.operations.ddl.AlterDatabaseOperation) QueryOperation(org.apache.flink.table.operations.QueryOperation) EndStatementSetOperation(org.apache.flink.table.operations.EndStatementSetOperation) UseModulesOperation(org.apache.flink.table.operations.UseModulesOperation) ShowFunctionsOperation(org.apache.flink.table.operations.ShowFunctionsOperation) CreateDatabaseOperation(org.apache.flink.table.operations.ddl.CreateDatabaseOperation) SetOperation(org.apache.flink.table.operations.command.SetOperation) LoadModuleOperation(org.apache.flink.table.operations.LoadModuleOperation) Operation(org.apache.flink.table.operations.Operation) ShowModulesOperation(org.apache.flink.table.operations.ShowModulesOperation) SourceQueryOperation(org.apache.flink.table.operations.SourceQueryOperation) UnloadModuleOperation(org.apache.flink.table.operations.UnloadModuleOperation) CreateTableOperation(org.apache.flink.table.operations.ddl.CreateTableOperation) RemoveJarOperation(org.apache.flink.table.operations.command.RemoveJarOperation) BeginStatementSetOperation(org.apache.flink.table.operations.BeginStatementSetOperation) AddJarOperation(org.apache.flink.table.operations.command.AddJarOperation) AlterTableAddConstraintOperation(org.apache.flink.table.operations.ddl.AlterTableAddConstraintOperation) ExplainOperation(org.apache.flink.table.operations.ExplainOperation) ResetOperation(org.apache.flink.table.operations.command.ResetOperation) StatementSetOperation(org.apache.flink.table.operations.StatementSetOperation) AlterTableRenameOperation(org.apache.flink.table.operations.ddl.AlterTableRenameOperation) CatalogTable(org.apache.flink.table.catalog.CatalogTable) CatalogFunctionImpl(org.apache.flink.table.catalog.CatalogFunctionImpl) CalciteParser(org.apache.flink.table.planner.parse.CalciteParser) SqlNode(org.apache.calcite.sql.SqlNode) Test(org.junit.Test)
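The expected map in the test shows how the legacy `toProperties` serialization flattens a table schema into indexed string keys such as `schema.0.name` and `schema.0.data-type`. A toy re-creation of just that key layout (not the Flink serializer itself, and covering only column names and types):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SchemaFlattenSketch {
    /** Toy version of the "schema.<i>.name" / "schema.<i>.data-type" key scheme. */
    static Map<String, String> flatten(String[][] columns) {
        Map<String, String> props = new LinkedHashMap<>();
        for (int i = 0; i < columns.length; i++) {
            props.put("schema." + i + ".name", columns[i][0]);
            props.put("schema." + i + ".data-type", columns[i][1]);
        }
        return props;
    }

    public static void main(String[] args) {
        Map<String, String> props = flatten(new String[][] {
            {"a", "INT"}, {"b", "BIGINT"}, {"c", "TIMESTAMP(3)"}
        });
        System.out.println(props.get("schema.2.data-type")); // TIMESTAMP(3)
    }
}
```

Watermark entries follow the same indexed convention (`schema.watermark.0.rowtime`, `schema.watermark.0.strategy.expr`), which is why the test can compare against a flat `HashMap`.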

Example 10 with CatalogFunction

Use of org.apache.flink.table.catalog.CatalogFunction in project flink by apache.

From the class FunctionITCase, method testAlterFunction:

@Test
public void testAlterFunction() throws Exception {
    String create = "create function f3 as 'org.apache.flink.function.TestFunction'";
    String alter = "alter function f3 as 'org.apache.flink.function.TestFunction2'";
    ObjectPath objectPath = new ObjectPath("default_database", "f3");
    assertTrue(tEnv().getCatalog("default_catalog").isPresent());
    Catalog catalog = tEnv().getCatalog("default_catalog").get();
    tEnv().executeSql(create);
    CatalogFunction beforeUpdate = catalog.getFunction(objectPath);
    assertEquals("org.apache.flink.function.TestFunction", beforeUpdate.getClassName());
    tEnv().executeSql(alter);
    CatalogFunction afterUpdate = catalog.getFunction(objectPath);
    assertEquals("org.apache.flink.function.TestFunction2", afterUpdate.getClassName());
}
Also used : ObjectPath(org.apache.flink.table.catalog.ObjectPath) CoreMatchers.containsString(org.hamcrest.CoreMatchers.containsString) CatalogFunction(org.apache.flink.table.catalog.CatalogFunction) Catalog(org.apache.flink.table.catalog.Catalog) Test(org.junit.Test)
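Both statements in the test share the shape `CREATE|ALTER FUNCTION <name> AS '<class>'`, and the assertions check only that the stored class name changes. In Flink the statements go through the full SQL parser; purely as an illustration of what the test exercises, a toy regex extraction of the name/class pair from those exact statement shapes:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class CreateFunctionSqlSketch {
    // Toy extraction of (name, className) from the statements used in the test
    // above; real parsing goes through Flink's SQL parser, not a regex.
    static final Pattern P = Pattern.compile(
        "(?i)(?:create|alter)\\s+function\\s+(\\S+)\\s+as\\s+'([^']+)'");

    static String[] parse(String sql) {
        Matcher m = P.matcher(sql);
        if (!m.find()) {
            throw new IllegalArgumentException("unrecognized statement: " + sql);
        }
        return new String[] {m.group(1), m.group(2)};
    }

    public static void main(String[] args) {
        String[] create = parse("create function f3 as 'org.apache.flink.function.TestFunction'");
        String[] alter  = parse("alter function f3 as 'org.apache.flink.function.TestFunction2'");
        System.out.println(create[1]); // org.apache.flink.function.TestFunction
        System.out.println(alter[1]);  // org.apache.flink.function.TestFunction2
    }
}
```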

Aggregations

CatalogFunction (org.apache.flink.table.catalog.CatalogFunction): 10
CatalogFunctionImpl (org.apache.flink.table.catalog.CatalogFunctionImpl): 7
Test (org.junit.Test): 4
ValidationException (org.apache.flink.table.api.ValidationException): 3
ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier): 3
CatalogException (org.apache.flink.table.catalog.exceptions.CatalogException): 3
Catalog (org.apache.flink.table.catalog.Catalog): 2
FunctionLanguage (org.apache.flink.table.catalog.FunctionLanguage): 2
UnresolvedIdentifier (org.apache.flink.table.catalog.UnresolvedIdentifier): 2
DatabaseNotExistException (org.apache.flink.table.catalog.exceptions.DatabaseNotExistException): 2
FunctionAlreadyExistException (org.apache.flink.table.catalog.exceptions.FunctionAlreadyExistException): 2
FunctionNotExistException (org.apache.flink.table.catalog.exceptions.FunctionNotExistException): 2
CreateCatalogFunctionOperation (org.apache.flink.table.operations.ddl.CreateCatalogFunctionOperation): 2
Function (org.apache.hadoop.hive.metastore.api.Function): 2
IOException (java.io.IOException): 1
ArrayList (java.util.ArrayList): 1
HashMap (java.util.HashMap): 1
ExecutionException (java.util.concurrent.ExecutionException): 1
SqlNode (org.apache.calcite.sql.SqlNode): 1
SqlParserException (org.apache.flink.table.api.SqlParserException): 1