Example 1 with ExprNodeConverter

Use of org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter in project hive by apache.

In the class HiveRexExecutorImpl, the method reduce:

@Override
public void reduce(RexBuilder rexBuilder, List<RexNode> constExps, List<RexNode> reducedValues) {
    RexNodeConverter rexNodeConverter = new RexNodeConverter(cluster);
    for (RexNode rexNode : constExps) {
        // initialize the converter
        ExprNodeConverter converter = new ExprNodeConverter("", null, null, null, new HashSet<Integer>(), cluster.getTypeFactory());
        // convert the RexNode into a Hive ExprNodeDesc (not necessarily a generic function)
        ExprNodeDesc expr = rexNode.accept(converter);
        if (expr instanceof ExprNodeGenericFuncDesc) {
            // fold the constant expression
            ExprNodeDesc constant = ConstantPropagateProcFactory.foldExpr((ExprNodeGenericFuncDesc) expr);
            if (constant != null) {
                try {
                    // convert constant back to RexNode
                    reducedValues.add(rexNodeConverter.convert((ExprNodeConstantDesc) constant));
                } catch (Exception e) {
                    LOG.warn(e.getMessage());
                    reducedValues.add(rexNode);
                }
            } else {
                reducedValues.add(rexNode);
            }
        } else {
            reducedValues.add(rexNode);
        }
    }
}
Also used : ExprNodeConstantDesc(org.apache.hadoop.hive.ql.plan.ExprNodeConstantDesc) RexNodeConverter(org.apache.hadoop.hive.ql.optimizer.calcite.translator.RexNodeConverter) ExprNodeGenericFuncDesc(org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc) ExprNodeDesc(org.apache.hadoop.hive.ql.plan.ExprNodeDesc) RexNode(org.apache.calcite.rex.RexNode) ExprNodeConverter(org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter)
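
The reduce method implements Calcite's RexExecutor contract: each constant RexNode is translated into a Hive ExprNodeDesc with ExprNodeConverter, folded by ConstantPropagateProcFactory, and translated back to a RexNode with RexNodeConverter; if any step fails, the original expression is kept. Below is a minimal, hedged sketch of how such an executor could be wired into a Calcite cluster and invoked directly; the single-argument constructor and the package of HiveRexExecutorImpl are assumptions inferred from the snippet above, not verified against the Hive source.

import java.util.ArrayList;
import java.util.List;

import org.apache.calcite.plan.RelOptCluster;
import org.apache.calcite.rex.RexExecutor;
import org.apache.calcite.rex.RexNode;
import org.apache.hadoop.hive.ql.optimizer.calcite.HiveRexExecutorImpl;

public class ConstantFoldingSketch {
    // Registers the Hive executor on the planner (so rules such as
    // ReduceExpressionsRule can pick it up) and folds the given constant expressions.
    static List<RexNode> foldConstants(RelOptCluster cluster, List<RexNode> constExps) {
        // Assumed constructor: HiveRexExecutorImpl(RelOptCluster), inferred from
        // the cluster field used inside reduce() above.
        RexExecutor executor = new HiveRexExecutorImpl(cluster);
        cluster.getPlanner().setExecutor(executor);

        List<RexNode> reducedValues = new ArrayList<RexNode>();
        executor.reduce(cluster.getRexBuilder(), constExps, reducedValues);
        return reducedValues;
    }
}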

Example 2 with ExprNodeConverter

Use of org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter in project hive by apache.

In the class HiveCalciteUtil, the method getExprNodes:

public static List<ExprNodeDesc> getExprNodes(List<Integer> inputRefs, RelNode inputRel, String inputTabAlias) {
    List<ExprNodeDesc> exprNodes = new ArrayList<ExprNodeDesc>();
    List<RexNode> rexInputRefs = getInputRef(inputRefs, inputRel);
    List<RexNode> exprs = inputRel.getChildExps();
    // TODO: Change ExprNodeConverter to be independent of Partition Expr
    ExprNodeConverter exprConv = new ExprNodeConverter(inputTabAlias, inputRel.getRowType(), new HashSet<Integer>(), inputRel.getCluster().getTypeFactory());
    for (int index = 0; index < rexInputRefs.size(); index++) {
        // check whether the corresponding expression in exprs is a literal
        if (exprs != null && index < exprs.size() && exprs.get(inputRefs.get(index)) instanceof RexLiteral) {
            // rexInputRefs holds the reference expressions for the values in inputRefs,
            // so inputRefs.get(index) gives the position of the literal within exprs
            ExprNodeDesc exprNodeDesc = exprConv.visitLiteral((RexLiteral) exprs.get(inputRefs.get(index)));
            exprNodes.add(exprNodeDesc);
        } else {
            RexNode iRef = rexInputRefs.get(index);
            exprNodes.add(iRef.accept(exprConv));
        }
    }
    return exprNodes;
}
Also used : RexLiteral(org.apache.calcite.rex.RexLiteral) ArrayList(java.util.ArrayList) ExprNodeDesc(org.apache.hadoop.hive.ql.plan.ExprNodeDesc) RexNode(org.apache.calcite.rex.RexNode) ExprNodeConverter(org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter)
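
getExprNodes resolves each requested input reference of a RelNode into a Hive ExprNodeDesc, taking a shortcut through visitLiteral when the underlying child expression is already a RexLiteral. A hedged caller sketch follows; the column positions and table alias are illustrative placeholders, not values taken from the Hive source.

import java.util.Arrays;
import java.util.List;

import org.apache.calcite.rel.RelNode;
import org.apache.hadoop.hive.ql.optimizer.calcite.HiveCalciteUtil;
import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;

public class ExprNodeLookupSketch {
    // Converts two hypothetical key columns of the input relation into Hive
    // expression descriptors, qualified with the table alias "t1".
    static List<ExprNodeDesc> keyExprs(RelNode inputRel) {
        List<Integer> keyPositions = Arrays.asList(0, 2); // hypothetical column positions
        return HiveCalciteUtil.getExprNodes(keyPositions, inputRel, "t1");
    }
}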

Example 3 with ExprNodeConverter

Use of org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter in project hive by apache.

In the class RelOptHiveTable, the method computePartitionList:

public void computePartitionList(HiveConf conf, RexNode pruneNode, Set<Integer> partOrVirtualCols) {
    try {
        if (!hiveTblMetadata.isPartitioned() || pruneNode == null || InputFinder.bits(pruneNode).length() == 0) {
            // there is no predicate on a partitioning column, so we need all
            // partitions in this case
            partitionList = PartitionPruner.prune(hiveTblMetadata, null, conf, getName(), partitionCache);
            return;
        }
        // We have valid pruning expressions, only retrieve qualifying partitions
        ExprNodeDesc pruneExpr = pruneNode.accept(new ExprNodeConverter(getName(), getRowType(), partOrVirtualCols, this.getRelOptSchema().getTypeFactory()));
        partitionList = PartitionPruner.prune(hiveTblMetadata, pruneExpr, conf, getName(), partitionCache);
    } catch (HiveException he) {
        throw new RuntimeException(he);
    }
}
Also used : HiveException(org.apache.hadoop.hive.ql.metadata.HiveException) ExprNodeDesc(org.apache.hadoop.hive.ql.plan.ExprNodeDesc) ExprNodeConverter(org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter)
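
computePartitionList converts the Calcite pruning predicate into a Hive ExprNodeDesc via ExprNodeConverter and hands it to PartitionPruner; without a usable predicate it simply fetches all partitions. The sketch below shows how a partition-pruning step might invoke it; the partition-column position and variable names are illustrative assumptions, not the actual rule code from Hive.

import java.util.HashSet;
import java.util.Set;

import org.apache.calcite.rex.RexNode;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.optimizer.calcite.RelOptHiveTable;

public class PartitionPruneSketch {
    // Asks the table to compute its qualifying partitions for a filter condition.
    // partOrVirtualCols tells ExprNodeConverter which input refs are partition or
    // virtual columns while it translates the predicate.
    static void prune(RelOptHiveTable table, HiveConf conf, RexNode filterCondition) {
        Set<Integer> partOrVirtualCols = new HashSet<Integer>();
        partOrVirtualCols.add(1); // hypothetical: second column is the partition column
        table.computePartitionList(conf, filterCondition, partOrVirtualCols);
    }
}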

Aggregations

ExprNodeConverter (org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter): 3 usages
ExprNodeDesc (org.apache.hadoop.hive.ql.plan.ExprNodeDesc): 3 usages
RexNode (org.apache.calcite.rex.RexNode): 2 usages
ArrayList (java.util.ArrayList): 1 usage
RexLiteral (org.apache.calcite.rex.RexLiteral): 1 usage
HiveException (org.apache.hadoop.hive.ql.metadata.HiveException): 1 usage
RexNodeConverter (org.apache.hadoop.hive.ql.optimizer.calcite.translator.RexNodeConverter): 1 usage
ExprNodeConstantDesc (org.apache.hadoop.hive.ql.plan.ExprNodeConstantDesc): 1 usage
ExprNodeGenericFuncDesc (org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc): 1 usage