
Example 61 with HiveMetaStoreClient

use of org.apache.hadoop.hive.metastore.HiveMetaStoreClient in project hive by apache.

the class SmokeTest method main.

public static void main(String[] args) throws Exception {
    SmokeTest test = new SmokeTest();
    // conf is a static Configuration field of SmokeTest; newMetastoreConf() builds it with the metastore defaults applied
    conf = MetastoreConf.newMetastoreConf();
    // Connect to the metastore and run the smoke test against it
    IMetaStoreClient client = new HiveMetaStoreClient(conf);
    test.runTest(client);
}
Also used : HiveMetaStoreClient(org.apache.hadoop.hive.metastore.HiveMetaStoreClient) IMetaStoreClient(org.apache.hadoop.hive.metastore.IMetaStoreClient)
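
For context, a minimal standalone sketch of the same idea is shown below (imports: org.apache.hadoop.conf.Configuration and org.apache.hadoop.hive.metastore.conf.MetastoreConf, plus the client classes listed above). It assumes a metastore Thrift service at thrift://localhost:9083, which is a placeholder address, and lists the databases as a simple connectivity check; in the original SmokeTest, conf is a static class field and test.runTest(client) exercises the metastore more thoroughly.

public static void main(String[] args) throws Exception {
    // Sketch only: build a fresh metastore Configuration and point it at a Thrift endpoint (placeholder URI)
    Configuration conf = MetastoreConf.newMetastoreConf();
    MetastoreConf.setVar(conf, MetastoreConf.ConfVars.THRIFT_URIS, "thrift://localhost:9083");
    IMetaStoreClient client = new HiveMetaStoreClient(conf);
    try {
        // Listing databases is a simple smoke check that the connection works
        System.out.println(client.getAllDatabases());
    } finally {
        client.close();
    }
}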

Example 62 with HiveMetaStoreClient

use of org.apache.hadoop.hive.metastore.HiveMetaStoreClient in project SQLWindowing by hbutani.

the class HiveUtils method getTable.

public static Table getTable(String db, String tableName, Configuration conf) throws WindowingException {
    HiveMetaStoreClient client = getClient(conf);
    db = validateDB(client, db);
    return getTable(client, db, tableName);
}
Also used : HiveMetaStoreClient(org.apache.hadoop.hive.metastore.HiveMetaStoreClient)
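
The getClient helper used above belongs to SQLWindowing's HiveUtils and its body is not shown here. A plausible sketch (an assumption, not the project's actual code) simply wraps the HiveMetaStoreClient constructor and translates metastore errors into WindowingException:

// Hypothetical sketch of a getClient-style helper; the real HiveUtils.getClient may differ.
public static HiveMetaStoreClient getClient(Configuration conf) throws WindowingException {
    try {
        // Wrap the Configuration in a HiveConf so metastore settings such as hive.metastore.uris are honored
        return new HiveMetaStoreClient(new HiveConf(conf, HiveUtils.class));
    } catch (MetaException me) {
        // Surface connection failures the same way the other HiveUtils methods do
        throw new WindowingException(me);
    }
}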

Example 63 with HiveMetaStoreClient

use of org.apache.hadoop.hive.metastore.HiveMetaStoreClient in project SQLWindowing by hbutani.

the class HiveUtils method addTableasJobInput.

@SuppressWarnings("unchecked")
public static List<FieldSchema> addTableasJobInput(String db, String table, JobConf job, FileSystem fs) throws WindowingException {
    LOG.info("HiveUtils::addTableasJobInput invoked");
    try {
        HiveMetaStoreClient client = getClient(job);
        // 1. get Table details from Hive metastore
        db = validateDB(client, db);
        Table t = getTable(client, db, table);
        StorageDescriptor sd = t.getSd();
        // 2. add table's location to job input
        FileInputFormat.addInputPath(job, new Path(sd.getLocation()));
        // 3. set job inputFormatClass, extract from StorageDescriptor
        Class<? extends InputFormat<? extends Writable, ? extends Writable>> inputFormatClass = (Class<? extends InputFormat<? extends Writable, ? extends Writable>>) Class.forName(sd.getInputFormat());
        job.setInputFormat(inputFormatClass);
        return client.getFields(db, table);
    } catch (WindowingException w) {
        throw w;
    } catch (Exception e) {
        throw new WindowingException(e);
    }
}
Also used : Path(org.apache.hadoop.fs.Path) HiveMetaStoreClient(org.apache.hadoop.hive.metastore.HiveMetaStoreClient) Table(org.apache.hadoop.hive.metastore.api.Table) TextInputFormat(org.apache.hadoop.mapred.TextInputFormat) InputFormat(org.apache.hadoop.mapred.InputFormat) FileInputFormat(org.apache.hadoop.mapred.FileInputFormat) StorageDescriptor(org.apache.hadoop.hive.metastore.api.StorageDescriptor) WindowingException(com.sap.hadoop.windowing.WindowingException) Writable(org.apache.hadoop.io.Writable) MetaException(org.apache.hadoop.hive.metastore.api.MetaException)
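
A hypothetical driver could use this method to point a MapReduce job at a Hive table. In the sketch below, configureJobForTable is an invented wrapper and "default"/"clicks" are placeholder database and table names:

public static List<FieldSchema> configureJobForTable(JobConf job) throws WindowingException, IOException {
    FileSystem fs = FileSystem.get(job);
    // After this call the job reads the table's HDFS location with the table's own InputFormat;
    // the returned FieldSchema list describes the columns the mappers will see.
    return HiveUtils.addTableasJobInput("default", "clicks", job, fs);
}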

Example 64 with HiveMetaStoreClient

use of org.apache.hadoop.hive.metastore.HiveMetaStoreClient in project SQLWindowing by hbutani.

the class HiveUtils method getFields.

public static List<FieldSchema> getFields(String db, String table, JobConf job) throws WindowingException {
    LOG.info("HiveUtils::getFields invoked");
    try {
        HiveMetaStoreClient client = getClient(job);
        db = validateDB(client, db);
        // Look up the table first so a missing table fails early; the returned Table is intentionally discarded
        getTable(client, db, table);
        return client.getFields(db, table);
    } catch (WindowingException w) {
        throw w;
    } catch (Exception e) {
        throw new WindowingException(e);
    }
}
Also used : HiveMetaStoreClient(org.apache.hadoop.hive.metastore.HiveMetaStoreClient) WindowingException(com.sap.hadoop.windowing.WindowingException) MetaException(org.apache.hadoop.hive.metastore.api.MetaException)
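
The returned FieldSchema objects carry each column's name and Hive type, so a caller might simply print the schema. The snippet below assumes an existing JobConf named job; "default"/"clicks" are placeholder names:

// Hypothetical caller: print each column's name and Hive type
for (FieldSchema field : HiveUtils.getFields("default", "clicks", job)) {
    System.out.println(field.getName() + " : " + field.getType());
}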

Example 65 with HiveMetaStoreClient

use of org.apache.hadoop.hive.metastore.HiveMetaStoreClient in project SQLWindowing by hbutani.

the class HiveUtils method getDeserializer.

public static Deserializer getDeserializer(String db, String table, Configuration conf) throws WindowingException {
    LOG.info("HiveUtils::getDeserializer invoked");
    try {
        HiveMetaStoreClient client = getClient(conf);
        db = validateDB(client, db);
        Table t = getTable(client, db, table);
        return MetaStoreUtils.getDeserializer(conf, t);
    } catch (WindowingException w) {
        throw w;
    } catch (Exception e) {
        throw new WindowingException(e);
    }
}
Also used : HiveMetaStoreClient(org.apache.hadoop.hive.metastore.HiveMetaStoreClient) Table(org.apache.hadoop.hive.metastore.api.Table) WindowingException(com.sap.hadoop.windowing.WindowingException) MetaException(org.apache.hadoop.hive.metastore.api.MetaException)
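
The Deserializer obtained here is normally used to get an ObjectInspector for the table's rows. A sketch of that follow-on step, with printRowSchema as an invented wrapper, placeholder names, and an existing Configuration conf:

public static void printRowSchema(Configuration conf) throws Exception {
    Deserializer serDe = HiveUtils.getDeserializer("default", "clicks", conf);
    // Hive row SerDes expose a StructObjectInspector describing the row layout
    StructObjectInspector rowOI = (StructObjectInspector) serDe.getObjectInspector();
    for (StructField field : rowOI.getAllStructFieldRefs()) {
        System.out.println(field.getFieldName() + " : " + field.getFieldObjectInspector().getTypeName());
    }
}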

Aggregations

HiveMetaStoreClient (org.apache.hadoop.hive.metastore.HiveMetaStoreClient): 141
IMetaStoreClient (org.apache.hadoop.hive.metastore.IMetaStoreClient): 81
Test (org.junit.Test): 78
Table (org.apache.hadoop.hive.metastore.api.Table): 60
FileSystem (org.apache.hadoop.fs.FileSystem): 57
Path (org.apache.hadoop.fs.Path): 45
HiveConf (org.apache.hadoop.hive.conf.HiveConf): 31
Before (org.junit.Before): 23
MetaException (org.apache.hadoop.hive.metastore.api.MetaException): 18
FileStatus (org.apache.hadoop.fs.FileStatus): 17
CliSessionState (org.apache.hadoop.hive.cli.CliSessionState): 16
File (java.io.File): 12
IOException (java.io.IOException): 12
HiveStreamingConnection (org.apache.hive.streaming.HiveStreamingConnection): 12
ArrayList (java.util.ArrayList): 11
TxnStore (org.apache.hadoop.hive.metastore.txn.TxnStore): 10
StreamingConnection (org.apache.hive.streaming.StreamingConnection): 10
List (java.util.List): 9
HashMap (java.util.HashMap): 8
CompactionRequest (org.apache.hadoop.hive.metastore.api.CompactionRequest): 8