Example 6 with TableFuncDef

use of com.sap.hadoop.windowing.query2.definition.TableFuncDef in project SQLWindowing by hbutani.

the class WindowFunctionTranslation method addInputColumnsToList.

public static void addInputColumnsToList(QueryDef qDef, TableFuncDef windowTableFnDef, ArrayList<String> fieldNames, ArrayList<ObjectInspector> fieldOIs) {
    QueryTranslationInfo tInfo = qDef.getTranslationInfo();
    InputInfo iInfo = tInfo.getInputInfo(windowTableFnDef.getInput());
    StructObjectInspector OI = (StructObjectInspector) iInfo.getOI();
    for (StructField f : OI.getAllStructFieldRefs()) {
        fieldNames.add(f.getFieldName());
        fieldOIs.add(f.getFieldObjectInspector());
    }
}
Also used : InputInfo(com.sap.hadoop.windowing.query2.translate.QueryTranslationInfo.InputInfo) StructField(org.apache.hadoop.hive.serde2.objectinspector.StructField) StructObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector)
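
A minimal sketch of how the collected names and ObjectInspectors are typically folded into a struct ObjectInspector via Hive's ObjectInspectorFactory. The wrapper class and its wiring are illustrative only, not part of the project; the package of WindowFunctionTranslation is assumed.

import java.util.ArrayList;

import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;

import com.sap.hadoop.windowing.query2.definition.QueryDef;
import com.sap.hadoop.windowing.query2.definition.TableFuncDef;
// package assumed; the class lives in the project's translate package
import com.sap.hadoop.windowing.query2.translate.WindowFunctionTranslation;

public class OutputOISketch {
    // collect the input column names/OIs and combine them into a standard struct OI
    public static StructObjectInspector buildOutputOI(QueryDef qDef, TableFuncDef windowTableFnDef) {
        ArrayList<String> fieldNames = new ArrayList<String>();
        ArrayList<ObjectInspector> fieldOIs = new ArrayList<ObjectInspector>();
        WindowFunctionTranslation.addInputColumnsToList(qDef, windowTableFnDef, fieldNames, fieldOIs);
        return ObjectInspectorFactory.getStandardStructObjectInspector(fieldNames, fieldOIs);
    }
}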

Example 7 with TableFuncDef

use of com.sap.hadoop.windowing.query2.definition.TableFuncDef in project SQLWindowing by hbutani.

the class InputTranslation method getTableAlias.

private static String getTableAlias(QueryDef qDef, int inputNum, QueryInputDef inputDef) throws WindowingException {
    if (inputDef instanceof HiveTableDef) {
        HiveTableDef hTbldef = (HiveTableDef) inputDef;
        String db = ((HiveTableSpec) hTbldef.getSpec()).getDbName();
        String tableName = ((HiveTableSpec) hTbldef.getSpec()).getTableName();
        return db + "." + tableName;
    } else if (inputDef instanceof TableFuncDef) {
        return "ptf_" + inputNum;
    }
    throw new WindowingException(sprintf("Internal Error: attempt to translate %s", inputDef.getSpec()));
}
Also used : WindowingException(com.sap.hadoop.windowing.WindowingException) HiveTableSpec(com.sap.hadoop.windowing.query2.specification.HiveTableSpec) HiveTableDef(com.sap.hadoop.windowing.query2.definition.HiveTableDef) TableFuncDef(com.sap.hadoop.windowing.query2.definition.TableFuncDef)
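
A hypothetical sketch of how an alias produced this way might be used to register columns in a Hive RowResolver, as the translation info does for each input. The column name ("_col0") and the string type are placeholders, not taken from the project.

import org.apache.hadoop.hive.ql.exec.ColumnInfo;
import org.apache.hadoop.hive.ql.parse.RowResolver;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory;

public class AliasUsageSketch {
    public static RowResolver registerColumn(String tableAlias) {
        RowResolver rr = new RowResolver();
        // "_col0" and the string type are placeholders; the real translation copies
        // names and types from the input's ObjectInspector
        ColumnInfo cInfo = new ColumnInfo("_col0", TypeInfoFactory.stringTypeInfo, tableAlias, false);
        rr.put(tableAlias, "_col0", cInfo);
        return rr;
    }
}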

Example 8 with TableFuncDef

use of com.sap.hadoop.windowing.query2.definition.TableFuncDef in project SQLWindowing by hbutani.

the class InputTranslation method translate.

/*
 * <ol>
 * <li> Get the <code>TableFunctionResolver</code> for this function from the FunctionRegistry.
 * <li> Create the TableFuncDef object.
 * <li> Get the InputInfo for the input to this function.
 * <li> Translate the arguments to this function in the context of the InputInfo.
 * <li> Ask the TableFunctionResolver to create a TableFunctionEvaluator based on the args passed in.
 * <li> Ask the TableFunctionEvaluator to set up the map-side ObjectInspector. This gives functions that
 * reshape the input before it is partitioned a chance to define the shape after the raw data is transformed.
 * <li> Set up the Window Definition for this function. The Window Definition is resolved with respect to the
 * InputDef's shape, or to the map OI for functions that reshape the raw input.
 * <li> Ask the TableFunctionEvaluator to set up the output ObjectInspector for this function.
 * <li> Set up a SerDe for the output partition based on the output OI.
 * </ol>
 */
private static TableFuncDef translate(QueryDef qDef, TableFuncSpec tSpec, QueryInputDef inputDef) throws WindowingException {
    QueryTranslationInfo tInfo = qDef.getTranslationInfo();
    TableFunctionResolver tFn = FunctionRegistry.getTableFunctionResolver(tSpec.getName());
    if (tFn == null) {
        throw new WindowingException(sprintf("Unknown Table Function %s", tSpec.getName()));
    }
    TableFuncDef tDef = new TableFuncDef();
    tDef.setSpec(tSpec);
    tDef.setInput(inputDef);
    InputInfo iInfo = tInfo.getInputInfo(inputDef);
    /*
     * translate args
     */
    ArrayList<ASTNode> args = tSpec.getArgs();
    if (args != null) {
        for (ASTNode expr : args) {
            ArgDef argDef = translateTableFunctionArg(qDef, tDef, iInfo, expr);
            tDef.addArg(argDef);
        }
    }
    tFn.initialize(qDef, tDef);
    TableFunctionEvaluator tEval = tFn.getEvaluator();
    tDef.setFunction(tEval);
    tFn.setupRawInputOI();
    tDef.setWindow(WindowSpecTranslation.translateWindow(qDef, tDef));
    tFn.setupOutputOI();
    TranslateUtils.setupSerdeAndOI(tDef, inputDef, tInfo, tEval);
    return tDef;
}
Also used : TableFunctionResolver(com.sap.hadoop.windowing.functions2.TableFunctionResolver) InputInfo(com.sap.hadoop.windowing.query2.translate.QueryTranslationInfo.InputInfo) TableFunctionEvaluator(com.sap.hadoop.windowing.functions2.TableFunctionEvaluator) WindowingException(com.sap.hadoop.windowing.WindowingException) ASTNode(org.apache.hadoop.hive.ql.parse.ASTNode) ArgDef(com.sap.hadoop.windowing.query2.definition.ArgDef) TableFuncDef(com.sap.hadoop.windowing.query2.definition.TableFuncDef)
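
The last two steps (output ObjectInspector plus output SerDe) are encapsulated in TranslateUtils.setupSerdeAndOI. The sketch below shows one way a SerDe could be derived from a struct output OI, assuming a LazyBinarySerDe configured through plain "columns"/"columns.types" properties; it is an illustration, not the project's actual implementation.

import java.util.Properties;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.serde2.lazybinary.LazyBinarySerDe;
import org.apache.hadoop.hive.serde2.objectinspector.StructField;
import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils;

public class OutputSerdeSketch {
    public static LazyBinarySerDe buildOutputSerde(StructObjectInspector outputOI) throws Exception {
        StringBuilder cols = new StringBuilder();
        StringBuilder types = new StringBuilder();
        for (StructField f : outputOI.getAllStructFieldRefs()) {
            if (cols.length() > 0) {
                cols.append(',');
                types.append(',');
            }
            cols.append(f.getFieldName());
            // derive the Hive type name from the field's ObjectInspector
            types.append(TypeInfoUtils.getTypeInfoFromObjectInspector(
                    f.getFieldObjectInspector()).getTypeName());
        }
        Properties props = new Properties();
        props.setProperty("columns", cols.toString());
        props.setProperty("columns.types", types.toString());
        LazyBinarySerDe serDe = new LazyBinarySerDe();
        serDe.initialize(new Configuration(), props);
        return serDe;
    }
}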

Example 9 with TableFuncDef

use of com.sap.hadoop.windowing.query2.definition.TableFuncDef in project SQLWindowing by hbutani.

the class MRUtils method initialize.

/**
 * Construct the data structures containing the ExprNodeDescs for the partition
 * columns and order columns. Use the input definition to construct the list
 * of output columns for the ReduceSinkOperator.
 *
 * @throws WindowingException
 */
public void initialize() throws WindowingException {
    TableFuncDef tabDef = RuntimeUtils.getFirstTableFunction(qdef);
    hiveTableDef = tabDef.getHiveTableDef();
    InputInfo inputInfo;
    ArrayList<ColumnDef> partColList = tabDef.getWindow().getPartDef().getColumns();
    TableFunctionEvaluator tEval = tabDef.getFunction();
    /*
     * If the query has a map phase, the inputInfo is retrieved from the map
     * output info of the table function definition. This is constructed
     * using the map output OI of the table function definition. If the
     * query does not have a map phase, the inputInfo is retrieved from the
     * QueryInputDef (either HiveTableDef or HiveQueryDef) of the query.
     */
    if (tEval.isTransformsRawInput()) {
        inputInfo = qdef.getTranslationInfo().getMapInputInfo(tabDef);
    } else {
        inputInfo = qdef.getTranslationInfo().getInputInfo(hiveTableDef);
    }
    for (ColumnDef colDef : partColList) {
        partCols.add(colDef.getExprNode());
    }
    ArrayList<OrderColumnDef> orderColList = tabDef.getWindow().getOrderDef().getColumns();
    for (OrderColumnDef colDef : orderColList) {
        Order order = colDef.getOrder();
        if (order.name().equals("ASC")) {
            orderString.append('+');
        } else {
            orderString.append('-');
        }
        orderCols.add(colDef.getExprNode());
        outputColumnNames.add(colDef.getAlias());
    }
    RowResolver rr = inputInfo.getRowResolver();
    ArrayList<ColumnInfo> colInfoList = rr.getColumnInfos();
    for (ColumnInfo colInfo : colInfoList) {
        String internalName = colInfo.getInternalName();
        TypeInfo type = colInfo.getType();
        valueCols.add(TranslateUtils.getExprDesc(internalName, type));
        outputColumnNames.add(internalName);
    }
}
Also used : Order(com.sap.hadoop.metadata.Order) OrderColumnDef(com.sap.hadoop.windowing.query2.definition.OrderColumnDef) ColumnDef(com.sap.hadoop.windowing.query2.definition.ColumnDef) ColumnInfo(org.apache.hadoop.hive.ql.exec.ColumnInfo) RowResolver(org.apache.hadoop.hive.ql.parse.RowResolver) TypeInfo(org.apache.hadoop.hive.serde2.typeinfo.TypeInfo) TableFuncDef(com.sap.hadoop.windowing.query2.definition.TableFuncDef) InputInfo(com.sap.hadoop.windowing.query2.translate.QueryTranslationInfo.InputInfo) TableFunctionEvaluator(com.sap.hadoop.windowing.functions2.TableFunctionEvaluator)
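
A small, hypothetical illustration of the two conventions used above: the order string carries one '+' (ascending) or '-' (descending) character per order column, and TranslateUtils.getExprDesc presumably builds a column expression (ExprNodeColumnDesc) keyed by the internal column name. Both helpers below are sketches, not project code.

import org.apache.hadoop.hive.ql.plan.ExprNodeColumnDesc;
import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfo;

import com.sap.hadoop.metadata.Order;

public class ReduceSinkSketch {
    // one character per order column: '+' for ASC, '-' otherwise
    public static String orderString(Order... orders) {
        StringBuilder sb = new StringBuilder();
        for (Order o : orders) {
            sb.append(o == Order.ASC ? '+' : '-');
        }
        return sb.toString();
    }

    // a plain column reference over the reduce-sink input; the empty table alias is a
    // simplification, the real translation resolves it from the RowResolver
    public static ExprNodeDesc columnExpr(String internalName, TypeInfo type) {
        return new ExprNodeColumnDesc(type, internalName, "", false);
    }
}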

Example 10 with TableFuncDef

use of com.sap.hadoop.windowing.query2.definition.TableFuncDef in project SQLWindowing by hbutani.

the class MRUtils method addPTFMapOperator.

/**
 * Returns true if the query needs a map-side reshape. A PTFOperator is added
 * on the map side before the ReduceSinkOperator in this scenario.
 *
 * @param qdef the translated query definition
 * @return true if the table function transforms its raw input
 * @throws WindowingException
 */
public static boolean addPTFMapOperator(QueryDef qdef) throws WindowingException {
    boolean hasMap = false;
    TableFuncDef tabDef = RuntimeUtils.getFirstTableFunction(qdef);
    TableFunctionEvaluator tEval = tabDef.getFunction();
    if (tEval.isTransformsRawInput()) {
        hasMap = true;
    }
    return hasMap;
}
Also used : TableFunctionEvaluator(com.sap.hadoop.windowing.functions2.TableFunctionEvaluator) TableFuncDef(com.sap.hadoop.windowing.query2.definition.TableFuncDef)
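
A hypothetical usage sketch of this flag during plan construction; buildMapSidePTFPlan and buildPlainMapPlan are stand-ins for the planner's real wiring, and MRUtils is assumed to be in the same package.

import com.sap.hadoop.windowing.WindowingException;
import com.sap.hadoop.windowing.query2.definition.QueryDef;

public class MapPhasePlanning {
    public static void planMapSide(QueryDef qdef) throws WindowingException {
        if (MRUtils.addPTFMapOperator(qdef)) {
            // the table function reshapes its raw input, so a PTFOperator is
            // placed before the ReduceSinkOperator on the map side
            buildMapSidePTFPlan(qdef);
        } else {
            // no map-side reshape: only the ReduceSinkOperator is needed
            buildPlainMapPlan(qdef);
        }
    }

    private static void buildMapSidePTFPlan(QueryDef qdef) { /* hypothetical */ }
    private static void buildPlainMapPlan(QueryDef qdef) { /* hypothetical */ }
}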

Aggregations

TableFuncDef (com.sap.hadoop.windowing.query2.definition.TableFuncDef) 9
WindowingException (com.sap.hadoop.windowing.WindowingException) 6
InputInfo (com.sap.hadoop.windowing.query2.translate.QueryTranslationInfo.InputInfo) 6
TableFunctionEvaluator (com.sap.hadoop.windowing.functions2.TableFunctionEvaluator) 5
QueryInputDef (com.sap.hadoop.windowing.query2.definition.QueryInputDef) 4
ArgDef (com.sap.hadoop.windowing.query2.definition.ArgDef) 2
OrderColumnDef (com.sap.hadoop.windowing.query2.definition.OrderColumnDef) 2
OrderDef (com.sap.hadoop.windowing.query2.definition.OrderDef) 2
WindowDef (com.sap.hadoop.windowing.query2.definition.WindowDef) 2
HiveException (org.apache.hadoop.hive.ql.metadata.HiveException) 2
ASTNode (org.apache.hadoop.hive.ql.parse.ASTNode) 2
StructObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector) 2
Order (com.sap.hadoop.metadata.Order) 1
TableFunctionResolver (com.sap.hadoop.windowing.functions2.TableFunctionResolver) 1
ColumnDef (com.sap.hadoop.windowing.query2.definition.ColumnDef) 1
HiveTableDef (com.sap.hadoop.windowing.query2.definition.HiveTableDef) 1
PartitionDef (com.sap.hadoop.windowing.query2.definition.PartitionDef) 1
WindowFunctionDef (com.sap.hadoop.windowing.query2.definition.WindowFunctionDef) 1
HiveTableSpec (com.sap.hadoop.windowing.query2.specification.HiveTableSpec) 1
TableFuncSpec (com.sap.hadoop.windowing.query2.specification.TableFuncSpec) 1