
Example 6 with FunctionDefinition

Use of org.apache.flink.table.functions.FunctionDefinition in project flink by apache.

From the class FilterUtils, the method getValue:

private static Comparable<?> getValue(Expression expr, Function<String, Comparable<?>> getter) {
    if (expr instanceof ValueLiteralExpression) {
        Optional<?> value = ((ValueLiteralExpression) expr).getValueAs(((ValueLiteralExpression) expr).getOutputDataType().getConversionClass());
        return (Comparable<?>) value.orElse(null);
    }
    if (expr instanceof FieldReferenceExpression) {
        return getter.apply(((FieldReferenceExpression) expr).getName());
    }
    if (expr instanceof CallExpression && expr.getChildren().size() == 1) {
        Object child = getValue(expr.getChildren().get(0), getter);
        FunctionDefinition functionDefinition = ((CallExpression) expr).getFunctionDefinition();
        if (functionDefinition.equals(UPPER)) {
            return child.toString().toUpperCase();
        } else if (functionDefinition.equals(LOWER)) {
            return child.toString().toLowerCase();
        } else {
            throw new UnsupportedOperationException(String.format("Unrecognized function definition: %s.", functionDefinition));
        }
    }
    throw new UnsupportedOperationException(expr + " not supported!");
}
Also used : ValueLiteralExpression(org.apache.flink.table.expressions.ValueLiteralExpression) FieldReferenceExpression(org.apache.flink.table.expressions.FieldReferenceExpression) FunctionDefinition(org.apache.flink.table.functions.FunctionDefinition) CallExpression(org.apache.flink.table.expressions.CallExpression)
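
The getter argument supplies the current value for a referenced field. As a hedged illustration (not part of the Flink source; the names are hypothetical), a minimal sketch of how such a getter could be backed by a partition-spec map, one common way filter/pruning utilities drive this kind of evaluation:

import java.util.Map;
import java.util.function.Function;

public class PartitionFieldGetterSketch {

    // Builds a field getter from a partition spec (column name -> string value).
    // The returned values are Strings, which satisfy the Comparable<?> contract
    // that getValue expects. A missing column simply yields null.
    public static Function<String, Comparable<?>> fromPartitionSpec(Map<String, String> spec) {
        return fieldName -> spec.get(fieldName);
    }
}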

Example 7 with FunctionDefinition

Use of org.apache.flink.table.functions.FunctionDefinition in project flink by apache.

From the class HiveModuleTest, the method testHiveBuiltInFunction:

@Test
public void testHiveBuiltInFunction() {
    FunctionDefinition fd = new HiveModule().getFunctionDefinition("reverse").get();
    HiveSimpleUDF udf = (HiveSimpleUDF) fd;
    DataType[] inputType = new DataType[] { DataTypes.STRING() };
    CallContext callContext = new HiveUDFCallContext(new Object[0], inputType);
    udf.getTypeInference(null).getOutputTypeStrategy().inferType(callContext);
    udf.open(null);
    assertEquals("cba", udf.eval("abc"));
}
Also used : HiveSimpleUDF(org.apache.flink.table.functions.hive.HiveSimpleUDF) DataType(org.apache.flink.table.types.DataType) FunctionDefinition(org.apache.flink.table.functions.FunctionDefinition) HiveUDFCallContext(org.apache.flink.table.functions.hive.HiveSimpleUDFTest.HiveUDFCallContext) CallContext(org.apache.flink.table.types.inference.CallContext) Test(org.junit.Test)
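
A hedged companion sketch (not part of the test; it assumes the HiveModule class from the Hive connector is on the classpath) showing the Module lookup pattern the test relies on: listFunctions() enumerates the Hive built-ins, and getFunctionDefinition() returns an Optional that is empty for unknown names:

import java.util.Optional;

import org.apache.flink.table.functions.FunctionDefinition;
import org.apache.flink.table.module.hive.HiveModule;

public class HiveModuleLookupSketch {

    public static void main(String[] args) {
        HiveModule module = new HiveModule();

        // Enumerate the Hive built-in functions exposed by the module.
        System.out.println("Hive built-ins: " + module.listFunctions().size());

        // Unknown names yield Optional.empty() rather than an exception, so callers
        // should check presence before casting, as the test does with get().
        Optional<FunctionDefinition> reverse = module.getFunctionDefinition("reverse");
        reverse.ifPresent(fd -> System.out.println("reverse -> " + fd.getClass().getSimpleName()));
    }
}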

Example 8 with FunctionDefinition

Use of org.apache.flink.table.functions.FunctionDefinition in project flink by apache.

From the class HiveParserDDLSemanticAnalyzer, the method convertCreateFunction:

private Operation convertCreateFunction(HiveParserASTNode ast) {
    // ^(TOK_CREATEFUNCTION identifier StringLiteral ({isTempFunction}? => TOK_TEMPORARY))
    String functionName = ast.getChild(0).getText().toLowerCase();
    boolean isTemporaryFunction = (ast.getFirstChildWithType(HiveASTParser.TOK_TEMPORARY) != null);
    String className = HiveParserBaseSemanticAnalyzer.unescapeSQLString(ast.getChild(1).getText());
    // Temp functions are not allowed to have qualified names.
    if (isTemporaryFunction && FunctionUtils.isQualifiedFunctionName(functionName)) {
        // i.e., a temporary function cannot belong to a catalog/db
        throw new ValidationException("Temporary function cannot be created with a qualified name.");
    }
    if (isTemporaryFunction) {
        FunctionDefinition funcDefinition = funcDefFactory.createFunctionDefinition(functionName, new CatalogFunctionImpl(className, FunctionLanguage.JAVA));
        return new CreateTempSystemFunctionOperation(functionName, false, funcDefinition);
    } else {
        ObjectIdentifier identifier = parseObjectIdentifier(functionName);
        CatalogFunction catalogFunction = new CatalogFunctionImpl(className, FunctionLanguage.JAVA);
        return new CreateCatalogFunctionOperation(identifier, catalogFunction, false, false);
    }
}
Also used : CreateCatalogFunctionOperation(org.apache.flink.table.operations.ddl.CreateCatalogFunctionOperation) ValidationException(org.apache.flink.table.api.ValidationException) FunctionDefinition(org.apache.flink.table.functions.FunctionDefinition) CatalogFunction(org.apache.flink.table.catalog.CatalogFunction) CreateTempSystemFunctionOperation(org.apache.flink.table.operations.ddl.CreateTempSystemFunctionOperation) CatalogFunctionImpl(org.apache.flink.table.catalog.CatalogFunctionImpl) ObjectIdentifier(org.apache.flink.table.catalog.ObjectIdentifier)
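
For the non-temporary branch, a hedged sketch (catalog, database, function, and class names are hypothetical) of how the resulting CreateCatalogFunctionOperation is assembled from the same building blocks used above:

import org.apache.flink.table.catalog.CatalogFunction;
import org.apache.flink.table.catalog.CatalogFunctionImpl;
import org.apache.flink.table.catalog.FunctionLanguage;
import org.apache.flink.table.catalog.ObjectIdentifier;
import org.apache.flink.table.operations.ddl.CreateCatalogFunctionOperation;

public class CreateFunctionOperationSketch {

    public static CreateCatalogFunctionOperation catalogFunction() {
        // A qualified function name resolves to catalog.database.name (hypothetical values).
        ObjectIdentifier identifier = ObjectIdentifier.of("my_catalog", "my_db", "my_upper");
        CatalogFunction function =
                new CatalogFunctionImpl("com.example.udf.MyUpper", FunctionLanguage.JAVA);
        // ignoreIfExists = false, isTemporary = false, mirroring the analyzer above.
        return new CreateCatalogFunctionOperation(identifier, function, false, false);
    }
}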

Example 9 with FunctionDefinition

Use of org.apache.flink.table.functions.FunctionDefinition in project flink by apache.

From the class ValuesOperationFactory, the method convertToExpectedType:

private Optional<ResolvedExpression> convertToExpectedType(ResolvedExpression sourceExpression, DataType targetDataType, ExpressionResolver.PostResolverFactory postResolverFactory) {
    LogicalType sourceLogicalType = sourceExpression.getOutputDataType().getLogicalType();
    LogicalType targetLogicalType = targetDataType.getLogicalType();
    // if the expression is a literal try converting the literal in place instead of casting
    if (sourceExpression instanceof ValueLiteralExpression) {
        // Assign a type to a null literal
        if (sourceLogicalType.is(NULL)) {
            return Optional.of(valueLiteral(null, targetDataType));
        }
        // Check if the source value class is a valid input conversion class of the target type.
        // It may happen that a user wanted to use a secondary input conversion class as a value
        // for a different type than what we derived.
        //
        // Example: we interpreted 1L as BIGINT, but the user wanted to interpret it as a TIMESTAMP.
        // In this case long is a valid conversion class for TIMESTAMP, but a
        // cast from BIGINT to TIMESTAMP is an invalid operation.
        Optional<Object> value = ((ValueLiteralExpression) sourceExpression).getValueAs(Object.class);
        if (value.isPresent() && targetLogicalType.supportsInputConversion(value.get().getClass())) {
            ValueLiteralExpression convertedLiteral = valueLiteral(value.get(), targetDataType.notNull().bridgedTo(value.get().getClass()));
            if (targetLogicalType.isNullable()) {
                return Optional.of(postResolverFactory.cast(convertedLiteral, targetDataType));
            } else {
                return Optional.of(convertedLiteral);
            }
        }
    }
    if (sourceExpression instanceof CallExpression) {
        FunctionDefinition functionDefinition = ((CallExpression) sourceExpression).getFunctionDefinition();
        if (functionDefinition == BuiltInFunctionDefinitions.ROW && targetLogicalType.is(ROW)) {
            return convertRowToExpectedType(sourceExpression, (FieldsDataType) targetDataType, postResolverFactory);
        } else if (functionDefinition == BuiltInFunctionDefinitions.ARRAY && targetLogicalType.is(ARRAY)) {
            return convertArrayToExpectedType(sourceExpression, (CollectionDataType) targetDataType, postResolverFactory);
        } else if (functionDefinition == BuiltInFunctionDefinitions.MAP && targetLogicalType.is(MAP)) {
            return convertMapToExpectedType(sourceExpression, (KeyValueDataType) targetDataType, postResolverFactory);
        }
    }
    // Nullability is ignored here (both types are copied as nullable) because the planner
    // might know that a certain function will not produce nullable values for a given input.
    if (supportsExplicitCast(sourceLogicalType.copy(true), targetLogicalType.copy(true))) {
        return Optional.of(postResolverFactory.cast(sourceExpression, targetDataType));
    } else {
        return Optional.empty();
    }
}
Also used : ValueLiteralExpression(org.apache.flink.table.expressions.ValueLiteralExpression) CollectionDataType(org.apache.flink.table.types.CollectionDataType) LogicalType(org.apache.flink.table.types.logical.LogicalType) FunctionDefinition(org.apache.flink.table.functions.FunctionDefinition) CallExpression(org.apache.flink.table.expressions.CallExpression)
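
A hedged sketch of a Table API call that exercises this conversion path: when fromValues is given an explicit row type, every literal must be converted in place, or cast, to the requested field type. The schema and values below are illustrative only:

import static org.apache.flink.table.api.Expressions.row;

import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;

public class FromValuesConversionSketch {

    public static void main(String[] args) {
        TableEnvironment env = TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // The DECIMAL(10, 2) target forces the integer literals in the second column to be
        // converted in place (or cast) instead of keeping their derived INT type.
        Table table = env.fromValues(
                DataTypes.ROW(
                        DataTypes.FIELD("id", DataTypes.BIGINT()),
                        DataTypes.FIELD("price", DataTypes.DECIMAL(10, 2))),
                row(1, 12),
                row(2, 13));

        table.execute().print();
    }
}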

Example 10 with FunctionDefinition

Use of org.apache.flink.table.functions.FunctionDefinition in project flink by apache.

From the class FunctionCatalog, the method resolvePreciseFunctionReference:

private Optional<ContextResolvedFunction> resolvePreciseFunctionReference(ObjectIdentifier oi) {
    // resolve order:
    // 1. Temporary functions
    // 2. Catalog functions
    ObjectIdentifier normalizedIdentifier = FunctionIdentifier.normalizeObjectIdentifier(oi);
    CatalogFunction potentialResult = tempCatalogFunctions.get(normalizedIdentifier);
    if (potentialResult != null) {
        return Optional.of(ContextResolvedFunction.temporary(FunctionIdentifier.of(oi), getFunctionDefinition(oi.getObjectName(), potentialResult)));
    }
    Optional<Catalog> catalogOptional = catalogManager.getCatalog(oi.getCatalogName());
    if (catalogOptional.isPresent()) {
        Catalog catalog = catalogOptional.get();
        try {
            CatalogFunction catalogFunction = catalog.getFunction(new ObjectPath(oi.getDatabaseName(), oi.getObjectName()));
            FunctionDefinition fd;
            if (catalog.getFunctionDefinitionFactory().isPresent() && catalogFunction.getFunctionLanguage() != FunctionLanguage.PYTHON) {
                fd = catalog.getFunctionDefinitionFactory().get().createFunctionDefinition(oi.getObjectName(), catalogFunction);
            } else {
                fd = getFunctionDefinition(oi.asSummaryString(), catalogFunction);
            }
            return Optional.of(ContextResolvedFunction.permanent(FunctionIdentifier.of(oi), fd));
        } catch (FunctionNotExistException e) {
        // Ignore
        }
    }
    return Optional.empty();
}
Also used : FunctionNotExistException(org.apache.flink.table.catalog.exceptions.FunctionNotExistException) AggregateFunctionDefinition(org.apache.flink.table.functions.AggregateFunctionDefinition) TableAggregateFunctionDefinition(org.apache.flink.table.functions.TableAggregateFunctionDefinition) TableFunctionDefinition(org.apache.flink.table.functions.TableFunctionDefinition) ScalarFunctionDefinition(org.apache.flink.table.functions.ScalarFunctionDefinition) FunctionDefinition(org.apache.flink.table.functions.FunctionDefinition)
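
A hedged sketch (hypothetical names) of the identifier handling that this precise lookup starts with: the fully qualified reference is normalized before the temporary-function map is probed:

import org.apache.flink.table.catalog.ObjectIdentifier;
import org.apache.flink.table.functions.FunctionIdentifier;

public class PreciseLookupSketch {

    public static void main(String[] args) {
        // A "precise" reference is fully qualified: catalog.database.function.
        ObjectIdentifier oi = ObjectIdentifier.of("my_catalog", "my_db", "MyUpper");

        // normalizeObjectIdentifier applies the same normalization the catalog uses for
        // function names (in practice, case normalization) before the lookup above
        // probes tempCatalogFunctions.
        ObjectIdentifier normalized = FunctionIdentifier.normalizeObjectIdentifier(oi);
        System.out.println(normalized.asSummaryString());
    }
}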

Aggregations

FunctionDefinition (org.apache.flink.table.functions.FunctionDefinition): 18 usages
AggregateFunctionDefinition (org.apache.flink.table.functions.AggregateFunctionDefinition): 7 usages
ScalarFunctionDefinition (org.apache.flink.table.functions.ScalarFunctionDefinition): 7 usages
CallExpression (org.apache.flink.table.expressions.CallExpression): 6 usages
ValueLiteralExpression (org.apache.flink.table.expressions.ValueLiteralExpression): 6 usages
TableFunctionDefinition (org.apache.flink.table.functions.TableFunctionDefinition): 6 usages
BuiltInFunctionDefinition (org.apache.flink.table.functions.BuiltInFunctionDefinition): 5 usages
TableException (org.apache.flink.table.api.TableException): 4 usages
ResolvedExpression (org.apache.flink.table.expressions.ResolvedExpression): 4 usages
TableAggregateFunctionDefinition (org.apache.flink.table.functions.TableAggregateFunctionDefinition): 4 usages
RexNode (org.apache.calcite.rex.RexNode): 3 usages
ValidationException (org.apache.flink.table.api.ValidationException): 3 usages
Expression (org.apache.flink.table.expressions.Expression): 3 usages
FieldReferenceExpression (org.apache.flink.table.expressions.FieldReferenceExpression): 3 usages
ImmutableList (com.google.common.collect.ImmutableList): 2 usages
BigDecimal (java.math.BigDecimal): 2 usages
ArrayList (java.util.ArrayList): 2 usages
List (java.util.List): 2 usages
RelDataType (org.apache.calcite.rel.type.RelDataType): 2 usages
SqlBasicCall (org.apache.calcite.sql.SqlBasicCall): 2 usages