
Example 1 with TDeserializer

Use of org.apache.thrift.TDeserializer in project hive by apache, in the class AcidTableSerializer, method decode:

/** Returns the {@link AcidTable} instance decoded from a base 64 representation. */
public static AcidTable decode(String encoded) throws IOException {
    if (!encoded.startsWith(PROLOG_V1)) {
        throw new IllegalStateException("Unsupported version.");
    }
    encoded = encoded.substring(PROLOG_V1.length());
    byte[] decoded = Base64.decodeBase64(encoded);
    AcidTable table = null;
    try (DataInputStream in = new DataInputStream(new ByteArrayInputStream(decoded))) {
        String databaseName = in.readUTF();
        String tableName = in.readUTF();
        boolean createPartitions = in.readBoolean();
        long transactionId = in.readLong();
        TableType tableType = TableType.valueOf(in.readByte());
        int thriftLength = in.readInt();
        table = new AcidTable(databaseName, tableName, createPartitions, tableType);
        table.setTransactionId(transactionId);
        Table metaTable = null;
        if (thriftLength > 0) {
            metaTable = new Table();
            try {
                byte[] thriftEncoded = new byte[thriftLength];
                in.readFully(thriftEncoded, 0, thriftLength);
                new TDeserializer(new TCompactProtocol.Factory()).deserialize(metaTable, thriftEncoded);
                table.setTable(metaTable);
            } catch (TException e) {
                throw new IOException("Error deserializing meta store table.", e);
            }
        }
    }
    return table;
}
Also used: TException (org.apache.thrift.TException), Table (org.apache.hadoop.hive.metastore.api.Table), TDeserializer (org.apache.thrift.TDeserializer), LoggerFactory (org.slf4j.LoggerFactory), IOException (java.io.IOException), DataInputStream (java.io.DataInputStream), ByteArrayInputStream (java.io.ByteArrayInputStream)
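The payload that decode() consumes is a Base64 string with a version prolog followed by a length-prefixed binary record. A minimal stdlib-only sketch of the matching framing is below; the PROLOG_V1 value is a hypothetical stand-in (the real constant is defined in AcidTableSerializer), java.util.Base64 stands in for the commons-codec Base64 used above, and the trailing Thrift payload is left empty:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.util.Base64;

public class AcidTableFraming {

    // Hypothetical version prolog; the real PROLOG_V1 value lives in AcidTableSerializer.
    static final String PROLOG_V1 = "P1:";

    /** Writes the same field layout that decode() reads; the Thrift payload is left empty. */
    static String encode(String db, String table, boolean createPartitions,
                         long transactionId, byte tableType) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (DataOutputStream out = new DataOutputStream(bytes)) {
            out.writeUTF(db);
            out.writeUTF(table);
            out.writeBoolean(createPartitions);
            out.writeLong(transactionId);
            out.writeByte(tableType);
            out.writeInt(0); // thriftLength: no serialized metastore Table appended
        }
        return PROLOG_V1 + Base64.getEncoder().encodeToString(bytes.toByteArray());
    }

    /** Reads back only the first field, mirroring the start of decode(). */
    static String decodeDatabaseName(String encoded) throws IOException {
        byte[] decoded = Base64.getDecoder().decode(encoded.substring(PROLOG_V1.length()));
        try (DataInputStream in = new DataInputStream(new ByteArrayInputStream(decoded))) {
            return in.readUTF();
        }
    }
}
```

The field order must match exactly on both sides, since DataInputStream has no self-describing structure; the length-prefixed Thrift blob at the end is what lets the decoder skip or read the optional metastore Table.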

Example 2 with TDeserializer

Use of org.apache.thrift.TDeserializer in project hive by apache, in the class MetadataJSONSerializer, method deserializeTable:

@Override
public HCatTable deserializeTable(String hcatTableStringRep) throws HCatException {
    try {
        Table table = new Table();
        new TDeserializer(new TJSONProtocol.Factory()).deserialize(table, hcatTableStringRep, "UTF-8");
        return new HCatTable(table);
    } catch (TException exception) {
        if (LOG.isDebugEnabled())
            LOG.debug("Could not de-serialize from: " + hcatTableStringRep);
        throw new HCatException("Could not de-serialize HCatTable.", exception);
    }
}
Also used: TException (org.apache.thrift.TException), Table (org.apache.hadoop.hive.metastore.api.Table), TDeserializer (org.apache.thrift.TDeserializer), HCatException (org.apache.hive.hcatalog.common.HCatException), LoggerFactory (org.slf4j.LoggerFactory)

Example 3 with TDeserializer

Use of org.apache.thrift.TDeserializer in project hive by apache, in the class MetadataJSONSerializer, method deserializePartition:

@Override
public HCatPartition deserializePartition(String hcatPartitionStringRep) throws HCatException {
    try {
        Partition partition = new Partition();
        new TDeserializer(new TJSONProtocol.Factory()).deserialize(partition, hcatPartitionStringRep, "UTF-8");
        return new HCatPartition(null, partition);
    } catch (TException exception) {
        if (LOG.isDebugEnabled())
            LOG.debug("Could not de-serialize partition from: " + hcatPartitionStringRep);
        throw new HCatException("Could not de-serialize HCatPartition.", exception);
    }
}
Also used: TException (org.apache.thrift.TException), Partition (org.apache.hadoop.hive.metastore.api.Partition), TDeserializer (org.apache.thrift.TDeserializer), HCatException (org.apache.hive.hcatalog.common.HCatException), LoggerFactory (org.slf4j.LoggerFactory)

Example 4 with TDeserializer

Use of org.apache.thrift.TDeserializer in project hive by apache, in the class EximUtil, method readMetaData:

public static ReadMetaData readMetaData(FileSystem fs, Path metadataPath) throws IOException, SemanticException {
    try (FSDataInputStream mdstream = fs.open(metadataPath)) {
        byte[] buffer = new byte[1024];
        ByteArrayOutputStream sb = new ByteArrayOutputStream();
        int read = mdstream.read(buffer);
        while (read != -1) {
            sb.write(buffer, 0, read);
            read = mdstream.read(buffer);
        }
        String md = new String(sb.toByteArray(), "UTF-8");
        JSONObject jsonContainer = new JSONObject(md);
        String version = jsonContainer.getString("version");
        String fcversion = getJSONStringEntry(jsonContainer, "fcversion");
        checkCompatibility(version, fcversion);
        String dbDesc = getJSONStringEntry(jsonContainer, "db");
        String tableDesc = getJSONStringEntry(jsonContainer, "table");
        TDeserializer deserializer = new TDeserializer(new TJSONProtocol.Factory());
        Database db = null;
        if (dbDesc != null) {
            db = new Database();
            deserializer.deserialize(db, dbDesc, "UTF-8");
        }
        Table table = null;
        List<Partition> partitionsList = null;
        if (tableDesc != null) {
            table = new Table();
            deserializer.deserialize(table, tableDesc, "UTF-8");
            // TODO : jackson-streaming-iterable-redo this
            JSONArray jsonPartitions = new JSONArray(jsonContainer.getString("partitions"));
            partitionsList = new ArrayList<Partition>(jsonPartitions.length());
            for (int i = 0; i < jsonPartitions.length(); ++i) {
                String partDesc = jsonPartitions.getString(i);
                Partition partition = new Partition();
                deserializer.deserialize(partition, partDesc, "UTF-8");
                partitionsList.add(partition);
            }
        }
        return new ReadMetaData(db, table, partitionsList, readReplicationSpec(jsonContainer));
    } catch (JSONException | TException e) {
        throw new SemanticException(ErrorMsg.ERROR_SERIALIZE_METADATA.getMsg(), e);
    }
}
Also used: TException (org.apache.thrift.TException), Partition (org.apache.hadoop.hive.metastore.api.Partition), TDeserializer (org.apache.thrift.TDeserializer), Table (org.apache.hadoop.hive.metastore.api.Table), JSONArray (org.json.JSONArray), JSONException (org.json.JSONException), ByteArrayOutputStream (java.io.ByteArrayOutputStream), TJSONProtocol (org.apache.thrift.protocol.TJSONProtocol), JSONObject (org.json.JSONObject), Database (org.apache.hadoop.hive.metastore.api.Database), FSDataInputStream (org.apache.hadoop.fs.FSDataInputStream)
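The manual 1 KB buffer loop in readMetaData predates the convenience methods added in Java 9. On a modern JDK, the read-fully-then-decode step can be written in one line; a sketch of just that part (the Hive-specific JSON and Thrift parsing is unchanged):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class ReadAll {
    /** Reads an entire stream into a UTF-8 string, replacing the manual buffer loop. */
    static String readToString(InputStream in) throws IOException {
        return new String(in.readAllBytes(), StandardCharsets.UTF_8);
    }
}
```

Using StandardCharsets.UTF_8 instead of the string literal "UTF-8" also avoids the checked UnsupportedEncodingException path in the String constructor.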

Example 5 with TDeserializer

Use of org.apache.thrift.TDeserializer in project storm by apache, in the class ThriftSerializationDelegate, method deserialize:

@Override
public <T> T deserialize(byte[] bytes, Class<T> clazz) {
    try {
        TBase instance = (TBase) clazz.newInstance();
        new TDeserializer().deserialize(instance, bytes);
        return (T) instance;
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
Also used: TDeserializer (org.apache.thrift.TDeserializer), TBase (org.apache.thrift.TBase), TException (org.apache.thrift.TException)
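The shape of Example 5 (instantiate by Class token, populate the instance in place, return it) is plain reflective Java rather than anything Thrift-specific. A stdlib-only sketch of the same pattern, with a hypothetical Readable interface standing in for TBase and clazz.newInstance() replaced by its non-deprecated equivalent:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class ReflectiveDecode {

    /** Stand-in for TBase: a type that can populate itself from a stream. */
    interface Readable {
        void read(DataInputStream in) throws IOException;
    }

    /** Mirrors ThriftSerializationDelegate.deserialize: instantiate, populate, return. */
    static <T extends Readable> T deserialize(byte[] bytes, Class<T> clazz) {
        try {
            // getDeclaredConstructor().newInstance() is the non-deprecated
            // replacement for clazz.newInstance() (deprecated since Java 9).
            T instance = clazz.getDeclaredConstructor().newInstance();
            instance.read(new DataInputStream(new ByteArrayInputStream(bytes)));
            return instance;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    /** Example payload type holding a single int field. */
    public static class Count implements Readable {
        int value;
        public Count() { }
        @Override
        public void read(DataInputStream in) throws IOException {
            value = in.readInt();
        }
    }
}
```

The generic bound lets the method return a typed instance without the unchecked cast that the TBase version needs, since TDeserializer.deserialize is declared against TBase rather than a type parameter.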

Aggregations

TDeserializer (org.apache.thrift.TDeserializer): 30
TException (org.apache.thrift.TException): 19
IOException (java.io.IOException): 8
TBase (org.apache.thrift.TBase): 6
Table (org.apache.hadoop.hive.metastore.api.Table): 4
TBinaryProtocol (org.apache.cassandra.thrift.TBinaryProtocol): 3
HCatException (org.apache.hive.hcatalog.common.HCatException): 3
ThriftSerializedObject (org.apache.storm.generated.ThriftSerializedObject): 3
TJSONProtocol (org.apache.thrift.protocol.TJSONProtocol): 3
LoggerFactory (org.slf4j.LoggerFactory): 3
RT (clojure.lang.RT): 2
FileNotFoundException (java.io.FileNotFoundException): 2
ArrayList (java.util.ArrayList): 2
HashMap (java.util.HashMap): 2
SlicePredicate (org.apache.cassandra.thrift.SlicePredicate): 2
ExecuteException (org.apache.commons.exec.ExecuteException): 2
Partition (org.apache.hadoop.hive.metastore.api.Partition): 2
ParseException (org.json.simple.parser.ParseException): 2
Example (com.airbnb.aerosolve.core.Example): 1
AgentStatMemoryGcBo (com.navercorp.pinpoint.common.server.bo.AgentStatMemoryGcBo): 1