
Example 11 with TableFuncDef

use of com.sap.hadoop.windowing.query2.definition.TableFuncDef in project SQLWindowing by hbutani.

the class PTFOperator method initializeOp.

/*
	 * 1. Find out if the operator is invoked at Map-Side or Reduce-side 
	 * 2. Get the deserialized QueryDef 
	 * 3. Reconstruct the transient variables in QueryDef 
	 * 4. Create input partition to store rows coming from previous operator
	 */
@Override
protected void initializeOp(Configuration jobConf) throws HiveException {
    hiveConf = new HiveConf(jobConf, PTFOperator.class);
    // if the parent is ExtractOperator, this invocation is from reduce-side
    Operator<? extends OperatorDesc> parentOp = getParentOperators().get(0);
    if (parentOp instanceof ExtractOperator) {
        isMapOperator = false;
    } else {
        isMapOperator = true;
    }
    // use the string from PTFDesc to get deserialized QueryDef
    qDef = (QueryDef) SerializationUtils.deserialize(new ByteArrayInputStream(conf.getQueryDefStr().getBytes()));
    try {
        reconstructQueryDef(hiveConf);
        inputPart = RuntimeUtils.createFirstPartitionForChain(qDef, inputObjInspectors[0], hiveConf, isMapOperator);
    } catch (WindowingException we) {
        throw new HiveException("Cannot create input partition for PTFOperator.", we);
    }
    // OI for ReduceSinkOperator is taken from TODO
    if (isMapOperator) {
        TableFuncDef tDef = RuntimeUtils.getFirstTableFunction(qDef);
        outputObjInspector = tDef.getMapOI();
    } else {
        outputObjInspector = qDef.getSelectList().getOI();
    }
    setupKeysWrapper(inputObjInspectors[0]);
    super.initializeOp(jobConf);
}
Also used : HiveException(org.apache.hadoop.hive.ql.metadata.HiveException) ByteArrayInputStream(java.io.ByteArrayInputStream) WindowingException(com.sap.hadoop.windowing.WindowingException) HiveConf(org.apache.hadoop.hive.conf.HiveConf) ExtractOperator(org.apache.hadoop.hive.ql.exec.ExtractOperator) TableFuncDef(com.sap.hadoop.windowing.query2.definition.TableFuncDef)
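For orientation, here is a minimal sketch of the same setup performed outside the operator: rebuilding the QueryDef from the serialized plan string and creating the partition that buffers incoming rows. The class and helper name are hypothetical, and the package locations of the project classes not listed in the snippet's imports are assumptions; the individual calls (SerializationUtils.deserialize, RuntimeUtils.createFirstPartitionForChain) are the ones used in initializeOp above.

import java.io.ByteArrayInputStream;

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;

// package locations below are assumed for illustration
import com.sap.hadoop.windowing.WindowingException;
import com.sap.hadoop.windowing.query2.definition.QueryDef;
import com.sap.hadoop.windowing.runtime2.Partition;
import com.sap.hadoop.windowing.runtime2.RuntimeUtils;
import com.sap.hadoop.windowing.runtime2.SerializationUtils;

public class PTFInitSketch {

    /*
     * Hypothetical helper mirroring the first half of initializeOp: the plan
     * ships the QueryDef only as a serialized string, so it is deserialized
     * here and then used to create the partition that buffers incoming rows.
     * (In the operator, reconstructQueryDef(hiveConf) also runs between these
     * two steps to rebuild transient state; that is a protected operator
     * method and is omitted from this sketch.)
     */
    static Partition buildInputPartition(String queryDefStr, ObjectInspector inputOI,
            HiveConf hiveConf, boolean isMapSide) throws WindowingException {
        QueryDef qDef = (QueryDef) SerializationUtils.deserialize(
                new ByteArrayInputStream(queryDefStr.getBytes()));
        return RuntimeUtils.createFirstPartitionForChain(qDef, inputOI, hiveConf, isMapSide);
    }
}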

Example 12 with TableFuncDef

use of com.sap.hadoop.windowing.query2.definition.TableFuncDef in project SQLWindowing by hbutani.

the class PTFOperator method processMapFunction.

protected void processMapFunction() throws HiveException {
    try {
        TableFuncDef tDef = RuntimeUtils.getFirstTableFunction(qDef);
        Partition outPart = tDef.getFunction().transformRawInput(inputPart);
        PartitionIterator<Object> pItr = outPart.iterator();
        while (pItr.hasNext()) {
            Object oRow = pItr.next();
            forward(oRow, outputObjInspector);
        }
    } catch (WindowingException we) {
        throw new HiveException("Cannot close PTFOperator.", we);
    }
}
Also used : Partition(com.sap.hadoop.windowing.runtime2.Partition) HiveException(org.apache.hadoop.hive.ql.metadata.HiveException) WindowingException(com.sap.hadoop.windowing.WindowingException) TableFuncDef(com.sap.hadoop.windowing.query2.definition.TableFuncDef)
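The PartitionIterator loop above is the generic way rows come back out of a Partition. Below is a minimal sketch, assuming a Partition that has already been populated elsewhere; it uses only the iterator calls shown in the snippet, while the helper name and the row-counting purpose are illustrative (imports for Partition as listed above, PartitionIterator package assumed).

// Sketch only: same hasNext()/next() pattern as processMapFunction, but the
// rows are counted instead of being forwarded to child operators.
static long countRows(Partition part) {
    PartitionIterator<Object> pItr = part.iterator();
    long rows = 0;
    while (pItr.hasNext()) {
        pItr.next();
        rows++;
    }
    return rows;
}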

Example 13 with TableFuncDef

use of com.sap.hadoop.windowing.query2.definition.TableFuncDef in project SQLWindowing by hbutani.

the class WindowFunctionTranslation method translate.

public static WindowFunctionDef translate(QueryDef qDef, TableFuncDef windowTableFnDef, WindowFunctionSpec wFnSpec) throws WindowingException {
    QueryTranslationInfo tInfo = qDef.getTranslationInfo();
    InputInfo iInfo = tInfo.getInputInfo(windowTableFnDef.getInput());
    WindowFunctionDef wFnDef = new WindowFunctionDef();
    wFnDef.setSpec(wFnSpec);
    /*
		 * translate args
		 */
    ArrayList<ASTNode> args = wFnSpec.getArgs();
    if (args != null) {
        for (ASTNode expr : args) {
            ArgDef argDef = translateWindowFunctionArg(qDef, windowTableFnDef, iInfo, expr);
            wFnDef.addArg(argDef);
        }
    }
    if (RANKING_FUNCS.contains(wFnSpec.getName())) {
        setupRankingArgs(qDef, windowTableFnDef, wFnDef, wFnSpec);
    }
    WindowDef wDef = translateWindowSpec(qDef, iInfo, wFnSpec);
    wFnDef.setWindow(wDef);
    validateWindowDefForWFn(windowTableFnDef, wFnDef);
    setupEvaluator(wFnDef);
    return wFnDef;
}
Also used : InputInfo(com.sap.hadoop.windowing.query2.translate.QueryTranslationInfo.InputInfo) WindowDef(com.sap.hadoop.windowing.query2.definition.WindowDef) ASTNode(org.apache.hadoop.hive.ql.parse.ASTNode) WindowFunctionDef(com.sap.hadoop.windowing.query2.definition.WindowFunctionDef) ArgDef(com.sap.hadoop.windowing.query2.definition.ArgDef)
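A hypothetical call-site sketch: during translation, each parsed WindowFunctionSpec attached to the windowing table function would be pushed through translate and the resulting WindowFunctionDefs collected. Only the WindowFunctionTranslation.translate(...) signature comes from the snippet; the surrounding method, the windowFnSpecs collection, and the variable names are assumptions (java.util.List/ArrayList and the project definition/specification classes imported as needed).

// Sketch, not project code: translate every window function spec belonging to
// the windowing table function. Where the resulting defs are attached on the
// TableFuncDef is not shown here, so they are just collected in a list.
// translate(...) throws WindowingException, so the caller declares it too.
static List<WindowFunctionDef> translateAll(QueryDef qDef, TableFuncDef windowTableFnDef,
        List<WindowFunctionSpec> windowFnSpecs) throws WindowingException {
    List<WindowFunctionDef> wFnDefs = new ArrayList<WindowFunctionDef>();
    for (WindowFunctionSpec wFnSpec : windowFnSpecs) {
        wFnDefs.add(WindowFunctionTranslation.translate(qDef, windowTableFnDef, wFnSpec));
    }
    return wFnDefs;
}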

Example 14 with TableFuncDef

use of com.sap.hadoop.windowing.query2.definition.TableFuncDef in project SQLWindowing by hbutani.

the class RuntimeUtils method createFirstPartitionForChain.

/**
	 * Create a new partition.
	 * The input OI is used to evaluate rows appended to the partition.
	 * The serde is determined based on whether the query has a map-phase 
	 * or not. The OI on the serde is used by PTFs to evaluate the output of the 
	 * partition. 
	 * @param qDef
	 * @param oi
	 * @param hiveConf
	 * @param isMapSide whether the partition is being created for the map phase of the query
	 * @return
	 * @throws WindowingException
	 */
public static Partition createFirstPartitionForChain(QueryDef qDef, ObjectInspector oi, HiveConf hiveConf, boolean isMapSide) throws WindowingException {
    TableFuncDef tabDef = getFirstTableFunction(qDef);
    TableFunctionEvaluator tEval = tabDef.getFunction();
    String partClassName = tEval.getPartitionClass();
    int partMemSize = tEval.getPartitionMemSize();
    Partition part = null;
    SerDe serde = tabDef.getInput().getSerde();
    part = new Partition(partClassName, partMemSize, serde, (StructObjectInspector) oi);
    return part;
}
Also used : SerDe(org.apache.hadoop.hive.serde2.SerDe) TableFunctionEvaluator(com.sap.hadoop.windowing.functions2.TableFunctionEvaluator) TableFuncDef(com.sap.hadoop.windowing.query2.definition.TableFuncDef) StructObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector)
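This is the method PTFOperator.initializeOp (Example 11) calls to set up its input buffer: the partition implementation class and memory budget come from the first table function's evaluator, while the serde comes from that function's input definition. A minimal call sketch, assuming the deserialized and reconstructed QueryDef, the ObjectInspector of the incoming rows, and the map/reduce flag are already in hand; the variable names mirror the ones used in Example 11.

// Sketch: the same call made from PTFOperator.initializeOp above. The last
// argument decides which serde ends up on the partition (map- vs. reduce-phase).
Partition inputPart = RuntimeUtils.createFirstPartitionForChain(
        qDef,                   // deserialized + reconstructed QueryDef
        inputObjInspectors[0],  // OI describing rows from the parent operator
        hiveConf,
        isMapOperator);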

Aggregations

TableFuncDef (com.sap.hadoop.windowing.query2.definition.TableFuncDef) 9
WindowingException (com.sap.hadoop.windowing.WindowingException) 6
InputInfo (com.sap.hadoop.windowing.query2.translate.QueryTranslationInfo.InputInfo) 6
TableFunctionEvaluator (com.sap.hadoop.windowing.functions2.TableFunctionEvaluator) 5
QueryInputDef (com.sap.hadoop.windowing.query2.definition.QueryInputDef) 4
ArgDef (com.sap.hadoop.windowing.query2.definition.ArgDef) 2
OrderColumnDef (com.sap.hadoop.windowing.query2.definition.OrderColumnDef) 2
OrderDef (com.sap.hadoop.windowing.query2.definition.OrderDef) 2
WindowDef (com.sap.hadoop.windowing.query2.definition.WindowDef) 2
HiveException (org.apache.hadoop.hive.ql.metadata.HiveException) 2
ASTNode (org.apache.hadoop.hive.ql.parse.ASTNode) 2
StructObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector) 2
Order (com.sap.hadoop.metadata.Order) 1
TableFunctionResolver (com.sap.hadoop.windowing.functions2.TableFunctionResolver) 1
ColumnDef (com.sap.hadoop.windowing.query2.definition.ColumnDef) 1
HiveTableDef (com.sap.hadoop.windowing.query2.definition.HiveTableDef) 1
PartitionDef (com.sap.hadoop.windowing.query2.definition.PartitionDef) 1
WindowFunctionDef (com.sap.hadoop.windowing.query2.definition.WindowFunctionDef) 1
HiveTableSpec (com.sap.hadoop.windowing.query2.specification.HiveTableSpec) 1
TableFuncSpec (com.sap.hadoop.windowing.query2.specification.TableFuncSpec) 1