
Example 6 with WindowingException

Use of com.sap.hadoop.windowing.WindowingException in project SQLWindowing by hbutani.

From the class HiveUtils, the method addTableasJobInput:

@SuppressWarnings("unchecked")
public static List<FieldSchema> addTableasJobInput(String db, String table, JobConf job, FileSystem fs) throws WindowingException {
    LOG.info("HiveUtils::addTableasJobInput invoked");
    try {
        HiveMetaStoreClient client = getClient(job);
        // 1. get Table details from Hive metastore
        db = validateDB(client, db);
        Table t = getTable(client, db, table);
        StorageDescriptor sd = t.getSd();
        // 2. add table's location to job input
        FileInputFormat.addInputPath(job, new Path(sd.getLocation()));
        // 3. set job inputFormatClass, extract from StorageDescriptor
        Class<? extends InputFormat<? extends Writable, ? extends Writable>> inputFormatClass = (Class<? extends InputFormat<? extends Writable, ? extends Writable>>) Class.forName(sd.getInputFormat());
        job.setInputFormat(inputFormatClass);
        return client.getFields(db, table);
    } catch (WindowingException w) {
        throw w;
    } catch (Exception e) {
        throw new WindowingException(e);
    }
}
Also used: Path(org.apache.hadoop.fs.Path) HiveMetaStoreClient(org.apache.hadoop.hive.metastore.HiveMetaStoreClient) Table(org.apache.hadoop.hive.metastore.api.Table) InputFormat(org.apache.hadoop.mapred.InputFormat) FileInputFormat(org.apache.hadoop.mapred.FileInputFormat) StorageDescriptor(org.apache.hadoop.hive.metastore.api.StorageDescriptor) WindowingException(com.sap.hadoop.windowing.WindowingException) Writable(org.apache.hadoop.io.Writable) MetaException(org.apache.hadoop.hive.metastore.api.MetaException) HiveException(org.apache.hadoop.hive.ql.metadata.HiveException)
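
For orientation, a caller might wire this helper into job setup roughly as in the sketch below. This is a minimal, hypothetical sketch rather than project code: the database and table names are placeholders, the HiveUtils import is omitted because its package is not shown in this snippet, and a reachable Hive metastore is assumed.

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.mapred.JobConf;

public class AddTableAsInputSketch {
    public static void main(String[] args) throws Exception {
        JobConf job = new JobConf(AddTableAsInputSketch.class);
        FileSystem fs = FileSystem.get(job);
        // "default" and "emp" are placeholder database/table names.
        // HiveUtils is the SQLWindowing helper shown above; its import is omitted here.
        HiveUtils.addTableasJobInput("default", "emp", job, fs);
        // The helper has now added the table's HDFS location as an input path and set
        // the job's InputFormat from the table's StorageDescriptor.
        System.out.println("InputFormat: " + job.getInputFormat().getClass().getName());
    }
}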

Example 7 with WindowingException

Use of com.sap.hadoop.windowing.WindowingException in project SQLWindowing by hbutani.

From the class HiveUtils, the method getFields:

public static List<FieldSchema> getFields(String db, String table, JobConf job) throws WindowingException {
    LOG.info("HiveUtils::getFields invoked");
    try {
        HiveMetaStoreClient client = getClient(job);
        db = validateDB(client, db);
        // getTable is called only to verify that the table exists; it throws if it does not
        getTable(client, db, table);
        return client.getFields(db, table);
    } catch (WindowingException w) {
        throw w;
    } catch (Exception e) {
        throw new WindowingException(e);
    }
}
Also used: HiveMetaStoreClient(org.apache.hadoop.hive.metastore.HiveMetaStoreClient) WindowingException(com.sap.hadoop.windowing.WindowingException) MetaException(org.apache.hadoop.hive.metastore.api.MetaException) HiveException(org.apache.hadoop.hive.ql.metadata.HiveException)
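
A hypothetical caller could use getFields to list a table's columns, roughly as below; again, the table names are placeholders and the HiveUtils import is omitted because its package is not shown here.

import java.util.List;

import org.apache.hadoop.hive.metastore.api.FieldSchema;
import org.apache.hadoop.mapred.JobConf;

public class ListColumnsSketch {
    public static void main(String[] args) throws Exception {
        JobConf job = new JobConf(ListColumnsSketch.class);
        // "default" and "emp" are placeholder names; HiveUtils import omitted as above.
        List<FieldSchema> cols = HiveUtils.getFields("default", "emp", job);
        for (FieldSchema col : cols) {
            // each FieldSchema carries the column name and its Hive type string
            System.out.println(col.getName() + "\t" + col.getType());
        }
    }
}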

Example 8 with WindowingException

Use of com.sap.hadoop.windowing.WindowingException in project SQLWindowing by hbutani.

From the class HiveUtils, the method getDeserializer:

public static Deserializer getDeserializer(String db, String table, Configuration conf) throws WindowingException {
    LOG.info("HiveUtils::getDeserializer invoked");
    try {
        HiveMetaStoreClient client = getClient(conf);
        db = validateDB(client, db);
        Table t = getTable(client, db, table);
        // instantiate and initialize the table's SerDe from the metadata in its StorageDescriptor
        return MetaStoreUtils.getDeserializer(conf, t);
    } catch (WindowingException w) {
        throw w;
    } catch (Exception e) {
        throw new WindowingException(e);
    }
}
Also used: HiveMetaStoreClient(org.apache.hadoop.hive.metastore.HiveMetaStoreClient) Table(org.apache.hadoop.hive.metastore.api.Table) WindowingException(com.sap.hadoop.windowing.WindowingException) MetaException(org.apache.hadoop.hive.metastore.api.MetaException) HiveException(org.apache.hadoop.hive.ql.metadata.HiveException)
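
The returned Deserializer is typically interrogated through its ObjectInspector, which describes the table's row shape. The sketch below is illustrative only: the table names are placeholders, the HiveUtils import is omitted as before, and a live metastore is assumed.

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.serde2.Deserializer;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;

public class DeserializerSketch {
    public static void main(String[] args) throws Exception {
        HiveConf conf = new HiveConf();
        // Placeholder database/table names; HiveUtils import omitted as above.
        Deserializer serDe = HiveUtils.getDeserializer("default", "emp", conf);
        // The ObjectInspector exposes the row structure that downstream
        // windowing code walks to read individual column values.
        ObjectInspector oi = serDe.getObjectInspector();
        System.out.println("Row type: " + oi.getTypeName());
    }
}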

Example 9 with WindowingException

Use of com.sap.hadoop.windowing.WindowingException in project SQLWindowing by hbutani.

From the class WindowSpecTranslation, the method translateWindowFrame:

static WindowFrameDef translateWindowFrame(QueryDef qDef, WindowFrameSpec wfSpec, InputInfo iInfo) throws WindowingException {
    if (wfSpec == null) {
        return null;
    }
    BoundarySpec s = wfSpec.getStart();
    BoundarySpec e = wfSpec.getEnd();
    WindowFrameDef wfDef = new WindowFrameDef(wfSpec);
    wfDef.setStart(translateBoundary(qDef, s, iInfo));
    wfDef.setEnd(translateBoundary(qDef, e, iInfo));
    int cmp = s.compareTo(e);
    if (cmp > 0) {
        throw new WindowingException(sprintf("Window range invalid, start boundary is greater than end boundary: %s", wfSpec));
    }
    return wfDef;
}
Also used : WindowFrameDef(com.sap.hadoop.windowing.query2.definition.WindowFrameDef) WindowingException(com.sap.hadoop.windowing.WindowingException) ValueBoundarySpec(com.sap.hadoop.windowing.query2.specification.WindowFrameSpec.ValueBoundarySpec) RangeBoundarySpec(com.sap.hadoop.windowing.query2.specification.WindowFrameSpec.RangeBoundarySpec) BoundarySpec(com.sap.hadoop.windowing.query2.specification.WindowFrameSpec.BoundarySpec)
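
The validation above relies on BoundarySpec instances being comparable: a frame is rejected when its start boundary sorts after its end boundary. As an illustration only, and not project code, the same idiom can be written generically over any Comparable type:

public final class FrameOrderCheck {
    // Generic stand-in for the BoundarySpec comparison in translateWindowFrame.
    static <T extends Comparable<T>> void requireOrdered(T start, T end) {
        if (start.compareTo(end) > 0) {
            throw new IllegalArgumentException(
                "Window range invalid, start boundary is greater than end boundary: "
                    + start + " > " + end);
        }
    }

    public static void main(String[] args) {
        requireOrdered(2, 10);  // accepted: start precedes end
        requireOrdered(10, 2);  // throws IllegalArgumentException: start sorts after end
    }
}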

Example 10 with WindowingException

Use of com.sap.hadoop.windowing.WindowingException in project SQLWindowing by hbutani.

From the class Partition, the method getAt:

public Object getAt(int i) throws WindowingException {
    try {
        // copy the i-th serialized row into the reusable Writable buffer wRow
        elems.get(i, wRow);
        // deserialize it into a row Object for the caller
        Object o = serDe.deserialize(wRow);
        return o;
    } catch (Exception se) {
        throw new WindowingException(se);
    }
}
Also used: WindowingException(com.sap.hadoop.windowing.WindowingException) ConcurrentModificationException(java.util.ConcurrentModificationException)
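
A reader of a Partition might fetch rows as in the sketch below. This is only a sketch under assumptions: the Partition import is omitted because its package is not shown here, and how the row count is obtained is not shown either, so it is passed in explicitly.

import com.sap.hadoop.windowing.WindowingException;

public class PartitionReadSketch {
    // "p" stands in for a Partition built elsewhere; only getAt is known from the snippet above.
    static void printRows(Partition p, int rowCount) {
        for (int i = 0; i < rowCount; i++) {
            try {
                Object row = p.getAt(i);  // deserializes the i-th stored row on demand
                System.out.println(row);
            } catch (WindowingException we) {
                System.err.println("failed to read row " + i + ": " + we.getMessage());
            }
        }
    }
}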

Aggregations

WindowingException (com.sap.hadoop.windowing.WindowingException): 62 usages
HiveException (org.apache.hadoop.hive.ql.metadata.HiveException): 18 usages
SerDeException (org.apache.hadoop.hive.serde2.SerDeException): 11 usages
ObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector): 10 usages
IOException (java.io.IOException): 9 usages
SerDe (org.apache.hadoop.hive.serde2.SerDe): 9 usages
ExprNodeEvaluator (org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator): 8 usages
ArrayList (java.util.ArrayList): 7 usages
StructObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector): 7 usages
HiveMetaStoreClient (org.apache.hadoop.hive.metastore.HiveMetaStoreClient): 6 usages
Properties (java.util.Properties): 5 usages
Path (org.apache.hadoop.fs.Path): 5 usages
MetaException (org.apache.hadoop.hive.metastore.api.MetaException): 5 usages
Table (org.apache.hadoop.hive.metastore.api.Table): 5 usages
ExprNodeDesc (org.apache.hadoop.hive.ql.plan.ExprNodeDesc): 5 usages
Writable (org.apache.hadoop.io.Writable): 5 usages
TableFuncDef (com.sap.hadoop.windowing.query2.definition.TableFuncDef): 4 usages
HiveConf (org.apache.hadoop.hive.conf.HiveConf): 4 usages
ASTNode (org.apache.hadoop.hive.ql.parse.ASTNode): 4 usages
ArgDef (com.sap.hadoop.windowing.query2.definition.ArgDef): 3 usages