
Example 1 with Expression

Use of org.apache.spark.sql.catalyst.expressions.Expression in project beam by apache.

The class EncoderFactory, method fromBeamCoder.

public static <T> Encoder<T> fromBeamCoder(Coder<T> coder) {
    Class<? super T> clazz = coder.getEncodedTypeDescriptor().getRawType();
    ClassTag<T> classTag = ClassTag$.MODULE$.apply(clazz);
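    // Serializer: encode the input object (bound reference 0) to binary using the Beam coder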
    Expression serializer = new EncoderHelpers.EncodeUsingBeamCoder<>(new BoundReference(0, new ObjectType(clazz), true), coder);
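    // Deserializer: read column 0 as binary and decode it back into a T using the Beam coder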
    Expression deserializer = new EncoderHelpers.DecodeUsingBeamCoder<>(new Cast(new GetColumnByOrdinal(0, BinaryType), BinaryType, scala.Option.<String>empty()), classTag, coder);
    return new ExpressionEncoder<>(serializer, deserializer, classTag);
}
Also used : Cast(org.apache.spark.sql.catalyst.expressions.Cast) ObjectType(org.apache.spark.sql.types.ObjectType) GetColumnByOrdinal(org.apache.spark.sql.catalyst.analysis.GetColumnByOrdinal) Expression(org.apache.spark.sql.catalyst.expressions.Expression) ExpressionEncoder(org.apache.spark.sql.catalyst.encoders.ExpressionEncoder) BoundReference(org.apache.spark.sql.catalyst.expressions.BoundReference)
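
A minimal usage sketch, assuming a Beam StringUtf8Coder, an existing SparkSession named spark, and sample data (none of which appear in the snippet above): the returned Encoder plugs into SparkSession.createDataset like any built-in encoder.

Encoder<String> encoder = EncoderFactory.fromBeamCoder(StringUtf8Coder.of());
// Each element is serialized by the Beam coder into a single binary column via the serializer
// expression above, and decoded back through the deserializer expression on read.
Dataset<String> dataset = spark.createDataset(Arrays.asList("a", "b", "c"), encoder);
dataset.show();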

Example 2 with Expression

Use of org.apache.spark.sql.catalyst.expressions.Expression in project jpmml-sparkml by jpmml.

The class ExpressionTranslatorTest, method checkValue.

public static void checkValue(Object expectedValue, String sqlExpression) {
    ConverterFactory converterFactory = new ConverterFactory(Collections.emptyMap());
    SparkMLEncoder encoder = new SparkMLEncoder(ExpressionTranslatorTest.schema, converterFactory);
    Expression expression = translateInternal("SELECT " + sqlExpression + " FROM __THIS__");
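    // Evaluate the Catalyst expression against an empty row to get the Spark-side value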
    Object sparkValue = expression.eval(InternalRow.empty());
    if (expectedValue instanceof String) {
        assertEquals(expectedValue, sparkValue.toString());
    } else if (expectedValue instanceof Integer) {
        assertEquals(expectedValue, ((Number) sparkValue).intValue());
    } else if (expectedValue instanceof Float) {
        assertEquals(expectedValue, ((Number) sparkValue).floatValue());
    } else if (expectedValue instanceof Double) {
        assertEquals(expectedValue, ((Number) sparkValue).doubleValue());
    } else {
        assertEquals(expectedValue, sparkValue);
    }
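    // Translate the Catalyst expression to PMML and evaluate it in a PMML context for comparison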
    org.dmg.pmml.Expression pmmlExpression = ExpressionTranslator.translate(encoder, expression);
    pmmlExpression = AliasExpression.unwrap(pmmlExpression);
    PMML pmml = encoder.encodePMML();
    EvaluationContext context = new VirtualEvaluationContext() {

        @Override
        public FieldValue resolve(FieldName name) {
            TransformationDictionary transformationDictionary = pmml.getTransformationDictionary();
            if (transformationDictionary != null && transformationDictionary.hasDerivedFields()) {
                List<DerivedField> derivedFields = transformationDictionary.getDerivedFields();
                for (DerivedField derivedField : derivedFields) {
                    if (Objects.equals(derivedField.getName(), name)) {
                        return ExpressionUtil.evaluate(derivedField, this);
                    }
                }
            }
            return super.resolve(name);
        }
    };
    context.declareAll(Collections.emptyMap());
    FieldValue value = ExpressionUtil.evaluate(pmmlExpression, context);
    Object pmmlValue = FieldValueUtil.getValue(value);
    assertEquals(expectedValue, pmmlValue);
}
Also used : TransformationDictionary(org.dmg.pmml.TransformationDictionary) Expression(org.apache.spark.sql.catalyst.expressions.Expression) PMML(org.dmg.pmml.PMML) EvaluationContext(org.jpmml.evaluator.EvaluationContext) VirtualEvaluationContext(org.jpmml.evaluator.VirtualEvaluationContext) FieldValue(org.jpmml.evaluator.FieldValue) FieldName(org.dmg.pmml.FieldName) DerivedField(org.dmg.pmml.DerivedField)
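
Illustrative calls, where the SQL fragments and expected values are assumptions rather than part of the test class: checkValue evaluates the expression on the Spark side, translates it to a PMML expression, evaluates that as well, and asserts that all results agree.

checkValue(3, "1 + 2");
checkValue(6.0d, "2.0D * 3.0D");
checkValue(1.5d, "abs(-1.5D)");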

Example 3 with Expression

Use of org.apache.spark.sql.catalyst.expressions.Expression in project jpmml-sparkml by jpmml.

The class ExpressionTranslatorTest, method translateInternal.

private static Expression translateInternal(String sqlStatement) {
    LogicalPlan logicalPlan = DatasetUtil.createAnalyzedLogicalPlan(ExpressionTranslatorTest.sparkSession, ExpressionTranslatorTest.schema, sqlStatement);
    List<Expression> expressions = JavaConversions.seqAsJavaList(logicalPlan.expressions());
    if (expressions.size() != 1) {
        throw new IllegalArgumentException();
    }
    return expressions.get(0);
}
Also used : Expression(org.apache.spark.sql.catalyst.expressions.Expression) LogicalPlan(org.apache.spark.sql.catalyst.plans.logical.LogicalPlan)
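
Illustrative calls, with column names and SQL fragments that are assumptions: the analyzed plan for a single-item SELECT projects exactly one expression, while a multi-column projection would trip the size check above.

Expression single = translateInternal("SELECT x1 + x2 FROM __THIS__");
// translateInternal("SELECT x1, x2 FROM __THIS__") would throw IllegalArgumentException (two expressions)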

Example 4 with Expression

Use of org.apache.spark.sql.catalyst.expressions.Expression in project carbondata by apache.

The class CarbonAntlrSqlVisitor, method visitMergeIntoCarbonTable.

public CarbonMergeIntoModel visitMergeIntoCarbonTable(CarbonSqlBaseParser.MergeIntoContext ctx) throws MalformedCarbonCommandException {
    // handle the exception message from the base parser
    if (ctx.exception != null) {
        throw new MalformedCarbonCommandException("Parse failed!");
    }
    TableModel targetTable = visitMultipartIdentifier(ctx.target);
    TableModel sourceTable = visitMultipartIdentifier(ctx.source);
    // Once we have these two tables, we can try to get the CarbonTable.
    // Build a clause list to store the when-matched and when-not-matched clauses.
    int size = ctx.getChildCount();
    int currIdx = 0;
    Expression joinExpression = null;
    List<Expression> mergeExpressions = new ArrayList<>();
    List<MergeAction> mergeActions = new ArrayList<>();
    // when matched / when not matched context
    while (currIdx < size) {
        if (ctx.getChild(currIdx) instanceof CarbonSqlBaseParser.PredicatedContext) {
            // This branch will visit the Join Expression
            ctx.getChild(currIdx).getChildCount();
            joinExpression = this.visitCarbonPredicated((CarbonSqlBaseParser.PredicatedContext) ctx.getChild(currIdx));
        } else if (ctx.getChild(currIdx) instanceof CarbonSqlBaseParser.MatchedClauseContext) {
            // This branch will deal with the Matched Clause
            Expression whenMatchedExpression = null;
            // Get the whenMatched expression
            try {
                if (this.containsWhenMatchedPredicateExpression(ctx.getChild(currIdx).getChildCount())) {
                    whenMatchedExpression = sparkParser.parseExpression(((CarbonSqlBaseParser.MatchedClauseContext) ctx.getChild(currIdx)).booleanExpression().getText());
                }
            } catch (ParseException e) {
                throw new MalformedCarbonCommandException("Parse failed: " + e.getMessage());
            }
            mergeExpressions.add(whenMatchedExpression);
            mergeActions.add(visitCarbonMatchedAction((CarbonSqlBaseParser.MatchedActionContext) ctx.getChild(currIdx).getChild(ctx.getChild(currIdx).getChildCount() - 1)));
        } else if (ctx.getChild(currIdx) instanceof CarbonSqlBaseParser.NotMatchedClauseContext) {
            // This branch will deal with the Not Matched Clause
            Expression whenNotMatchedExpression = null;
            // Get the whenNotMatched expression
            try {
                if (this.containsWhenNotMatchedPredicateExpression(ctx.getChild(currIdx).getChildCount())) {
                    whenNotMatchedExpression = sparkParser.parseExpression(((CarbonSqlBaseParser.NotMatchedClauseContext) ctx.getChild(currIdx)).booleanExpression().getText());
                }
            } catch (ParseException e) {
                throw new MalformedCarbonCommandException("Parse failed: " + e.getMessage());
            }
            mergeExpressions.add(whenNotMatchedExpression);
            CarbonSqlBaseParser.NotMatchedActionContext notMatchedActionContext = (CarbonSqlBaseParser.NotMatchedActionContext) ctx.getChild(currIdx).getChild(ctx.getChild(currIdx).getChildCount() - 1);
            if (notMatchedActionContext.getChildCount() <= 2) {
                mergeActions.add(InsertAction.apply(null, true));
            } else if (notMatchedActionContext.ASTERISK() == null) {
                if (notMatchedActionContext.columns.multipartIdentifier().size() != notMatchedActionContext.expression().size()) {
                    throw new MalformedCarbonCommandException("Parse failed: size of columns " + "is not equal to size of expression in not matched action.");
                }
                Map<Column, Column> insertMap = new HashMap<>();
                for (int i = 0; i < notMatchedActionContext.columns.multipartIdentifier().size(); i++) {
                    String left = visitMultipartIdentifier(notMatchedActionContext.columns.multipartIdentifier().get(i), "").getColName();
                    String right = notMatchedActionContext.expression().get(i).getText();
                    // sometimes the right side is a literal or an expression, not a table column,
                    // so we need to check whether the right side is a column or an expression
                    Column rightColumn = null;
                    try {
                        Expression expression = sparkParser.parseExpression(right);
                        rightColumn = new Column(expression);
                    } catch (Exception ex) {
                        throw new MalformedCarbonCommandException("Parse failed: " + ex.getMessage());
                    }
                    insertMap.put(new Column(left), rightColumn);
                }
                mergeActions.add(InsertAction.apply(SparkUtil.convertMap(insertMap), false));
            } else {
                mergeActions.add(InsertAction.apply(null, false));
            }
        }
        currIdx++;
    }
    return new CarbonMergeIntoModel(targetTable, sourceTable, joinExpression, mergeExpressions, mergeActions);
}
Also used : CarbonSqlBaseParser(org.apache.spark.sql.parser.CarbonSqlBaseParser) ArrayList(java.util.ArrayList) ParseException(org.apache.spark.sql.catalyst.parser.ParseException) MalformedCarbonCommandException(org.apache.carbondata.common.exceptions.sql.MalformedCarbonCommandException) MergeAction(org.apache.spark.sql.execution.command.mutation.merge.MergeAction) Expression(org.apache.spark.sql.catalyst.expressions.Expression) CarbonJoinExpression(org.apache.spark.sql.merge.model.CarbonJoinExpression) HashMap(java.util.HashMap) Map(java.util.Map) TableModel(org.apache.spark.sql.merge.model.TableModel) CarbonMergeIntoModel(org.apache.spark.sql.merge.model.CarbonMergeIntoModel)
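
For orientation, a MERGE INTO statement of the shape this visitor handles (table and column names are assumptions): the ON predicate becomes joinExpression, and each WHEN clause contributes one entry to mergeExpressions and one to mergeActions.

String mergeSql =
    "MERGE INTO target t USING source s ON t.id = s.id "
    + "WHEN MATCHED AND s.flag = 1 THEN UPDATE SET t.value = s.value "
    + "WHEN NOT MATCHED THEN INSERT (id, value) VALUES (s.id, s.value)";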

Example 5 with Expression

Use of org.apache.spark.sql.catalyst.expressions.Expression in project carbondata by apache.

The class CarbonAntlrSqlVisitor, method visitCarbonAssignmentList.

public MergeAction visitCarbonAssignmentList(CarbonSqlBaseParser.AssignmentListContext ctx) throws MalformedCarbonCommandException {
    // UPDATE SET assignmentList
    Map<Column, Column> map = new HashMap<>();
    for (int currIdx = 0; currIdx < ctx.getChildCount(); currIdx++) {
        if (ctx.getChild(currIdx) instanceof CarbonSqlBaseParser.AssignmentContext) {
            // Assume the assignments are all used to pass values
            String left = ctx.getChild(currIdx).getChild(0).getText();
            if (left.split("\\.").length > 1) {
                left = left.split("\\.")[1];
            }
            String right = ctx.getChild(currIdx).getChild(2).getText();
            Column rightColumn = null;
            try {
                Expression expression = sparkParser.parseExpression(right);
                rightColumn = new Column(expression);
            } catch (Exception e) {
                throw new MalformedCarbonCommandException("Parse failed: " + e.getMessage());
            }
            map.put(new Column(left), rightColumn);
        }
    }
    return new UpdateAction(SparkUtil.convertMap(map), false);
}
Also used : HashMap(java.util.HashMap) Expression(org.apache.spark.sql.catalyst.expressions.Expression) CarbonJoinExpression(org.apache.spark.sql.merge.model.CarbonJoinExpression) UpdateAction(org.apache.spark.sql.execution.command.mutation.merge.UpdateAction) MalformedCarbonCommandException(org.apache.carbondata.common.exceptions.sql.MalformedCarbonCommandException) ParseException(org.apache.spark.sql.catalyst.parser.ParseException)
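
A minimal sketch of the right-hand-side handling, assuming a Spark ParserInterface named sparkParser (as in the visitor above) and illustrative column names: each assignment's value is parsed into a Catalyst Expression and wrapped in a Column, while the table prefix is stripped from the left-hand column name.

// For the assignment "t.qty = s.qty + 1" the visitor would roughly produce:
Expression rhs = sparkParser.parseExpression("s.qty + 1");
Column qtyValue = new Column(rhs);
map.put(new Column("qty"), qtyValue);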

Aggregations

Expression (org.apache.spark.sql.catalyst.expressions.Expression) 8
DerivedField (org.dmg.pmml.DerivedField) 3
FieldName (org.dmg.pmml.FieldName) 3
ArrayList (java.util.ArrayList) 2
HashMap (java.util.HashMap) 2
MalformedCarbonCommandException (org.apache.carbondata.common.exceptions.sql.MalformedCarbonCommandException) 2
Cast (org.apache.spark.sql.catalyst.expressions.Cast) 2
ParseException (org.apache.spark.sql.catalyst.parser.ParseException) 2
CarbonJoinExpression (org.apache.spark.sql.merge.model.CarbonJoinExpression) 2
DataType (org.dmg.pmml.DataType) 2
FieldRef (org.dmg.pmml.FieldRef) 2
OpType (org.dmg.pmml.OpType) 2
Iterator (java.util.Iterator) 1
List (java.util.List) 1
Map (java.util.Map) 1
GetColumnByOrdinal (org.apache.spark.sql.catalyst.analysis.GetColumnByOrdinal) 1
ExpressionEncoder (org.apache.spark.sql.catalyst.encoders.ExpressionEncoder) 1
Abs (org.apache.spark.sql.catalyst.expressions.Abs) 1
Acos (org.apache.spark.sql.catalyst.expressions.Acos) 1
Add (org.apache.spark.sql.catalyst.expressions.Add) 1