
Example 16 with TSerializer

use of org.apache.thrift.TSerializer in project vcell by virtualcell.

the class VisMeshUtils method writeChomboIndexData.

static void writeChomboIndexData(File chomboIndexFile, ChomboIndexData chomboIndexData) throws IOException {
    TSerializer serializer = new TSerializer(new TBinaryProtocol.Factory());
    try {
        byte[] blob = serializer.serialize(chomboIndexData);
        FileUtils.writeByteArrayToFile(chomboIndexFile, blob);
    } catch (TException e) {
        e.printStackTrace();
        throw new IOException("error writing ChomboIndexData to file " + chomboIndexFile.getPath() + ": " + e.getMessage(), e);
    }
}
Also used : TException(org.apache.thrift.TException) TSerializer(org.apache.thrift.TSerializer) TBinaryProtocol(org.apache.thrift.protocol.TBinaryProtocol) IOException(java.io.IOException)
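The example above only covers the write path. Below is a minimal sketch of the matching read side, assuming the same TBinaryProtocol and commons-io FileUtils; the readChomboIndexData helper name is hypothetical and not part of the vcell code shown here.

static ChomboIndexData readChomboIndexData(File chomboIndexFile) throws IOException {
    try {
        // Read the raw blob written by writeChomboIndexData.
        byte[] blob = FileUtils.readFileToByteArray(chomboIndexFile);
        // Deserialization must use the same protocol the data was written with.
        TDeserializer deserializer = new TDeserializer(new TBinaryProtocol.Factory());
        ChomboIndexData chomboIndexData = new ChomboIndexData();
        deserializer.deserialize(chomboIndexData, blob);
        return chomboIndexData;
    } catch (TException e) {
        throw new IOException("error reading ChomboIndexData from file " + chomboIndexFile.getPath() + ": " + e.getMessage(), e);
    }
}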

Example 17 with TSerializer

use of org.apache.thrift.TSerializer in project hive by apache.

the class EximUtil method createExportDump.

public static void createExportDump(FileSystem fs, Path metadataPath, org.apache.hadoop.hive.ql.metadata.Table tableHandle, Iterable<org.apache.hadoop.hive.ql.metadata.Partition> partitions, ReplicationSpec replicationSpec) throws SemanticException, IOException {
    if (replicationSpec == null) {
        // instantiate default values if not specified
        replicationSpec = new ReplicationSpec();
    }
    if (tableHandle == null) {
        replicationSpec.setNoop(true);
    }
    OutputStream out = fs.create(metadataPath);
    JsonGenerator jgen = (new JsonFactory()).createJsonGenerator(out);
    jgen.writeStartObject();
    jgen.writeStringField("version", METADATA_FORMAT_VERSION);
    if (METADATA_FORMAT_FORWARD_COMPATIBLE_VERSION != null) {
        jgen.writeStringField("fcversion", METADATA_FORMAT_FORWARD_COMPATIBLE_VERSION);
    }
    if (replicationSpec.isInReplicationScope()) {
        for (ReplicationSpec.KEY key : ReplicationSpec.KEY.values()) {
            String value = replicationSpec.get(key);
            if (value != null) {
                jgen.writeStringField(key.toString(), value);
            }
        }
        if (tableHandle != null) {
            Table ttable = tableHandle.getTTable();
            ttable.putToParameters(ReplicationSpec.KEY.CURR_STATE_ID.toString(), replicationSpec.getCurrentReplicationState());
            if ((ttable.getParameters().containsKey("EXTERNAL")) && (ttable.getParameters().get("EXTERNAL").equalsIgnoreCase("TRUE"))) {
                // Replication destination will not be external - override if set
                ttable.putToParameters("EXTERNAL", "FALSE");
            }
            if (ttable.isSetTableType() && ttable.getTableType().equalsIgnoreCase(TableType.EXTERNAL_TABLE.toString())) {
                // Replication dest will not be external - override if set
                ttable.setTableType(TableType.MANAGED_TABLE.toString());
            }
        }
    } else {
    // ReplicationSpec.KEY scopeKey = ReplicationSpec.KEY.REPL_SCOPE;
    // write(out, ",\""+ scopeKey.toString() +"\":\"" + replicationSpec.get(scopeKey) + "\"");
    // TODO: if we want to be explicit about this dump not being a replication dump, we can
    // uncomment this else section, but currently unneeded. Will require a lot of golden file
    // regen if we do so.
    }
    if ((tableHandle != null) && (!replicationSpec.isNoop())) {
        TSerializer serializer = new TSerializer(new TJSONProtocol.Factory());
        try {
            jgen.writeStringField("table", serializer.toString(tableHandle.getTTable(), "UTF-8"));
            jgen.writeFieldName("partitions");
            jgen.writeStartArray();
            if (partitions != null) {
                for (org.apache.hadoop.hive.ql.metadata.Partition partition : partitions) {
                    Partition tptn = partition.getTPartition();
                    if (replicationSpec.isInReplicationScope()) {
                        tptn.putToParameters(ReplicationSpec.KEY.CURR_STATE_ID.toString(), replicationSpec.getCurrentReplicationState());
                        if ((tptn.getParameters().containsKey("EXTERNAL")) && (tptn.getParameters().get("EXTERNAL").equalsIgnoreCase("TRUE"))) {
                            // Replication destination will not be external
                            tptn.putToParameters("EXTERNAL", "FALSE");
                        }
                    }
                    jgen.writeString(serializer.toString(tptn, "UTF-8"));
                    jgen.flush();
                }
            }
            jgen.writeEndArray();
        } catch (TException e) {
            throw new SemanticException(ErrorMsg.ERROR_SERIALIZE_METASTORE.getMsg(), e);
        }
    }
    jgen.writeEndObject();
    // JsonGenerator owns the OutputStream, so it closes it when we call close.
    jgen.close();
}
Also used : TException(org.apache.thrift.TException) Partition(org.apache.hadoop.hive.metastore.api.Partition) Table(org.apache.hadoop.hive.metastore.api.Table) ByteArrayOutputStream(java.io.ByteArrayOutputStream) OutputStream(java.io.OutputStream) JsonFactory(org.codehaus.jackson.JsonFactory) TSerializer(org.apache.thrift.TSerializer) TJSONProtocol(org.apache.thrift.protocol.TJSONProtocol) JsonGenerator(org.codehaus.jackson.JsonGenerator)
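In this example, TSerializer with TJSONProtocol turns the Thrift Table and Partition objects into JSON strings that are embedded as field values inside the Jackson-generated metadata document. The sketch below shows the reverse direction, assuming the same TJSONProtocol and java.nio.charset.StandardCharsets; it is not Hive's actual import code.

static Table deserializeTableField(String tableJson) throws TException {
    // The "table" field written above is a Thrift JSON blob; read it back with
    // a TDeserializer configured for the same protocol.
    TDeserializer deserializer = new TDeserializer(new TJSONProtocol.Factory());
    Table table = new Table();
    deserializer.deserialize(table, tableJson.getBytes(StandardCharsets.UTF_8));
    return table;
}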

Example 18 with TSerializer

use of org.apache.thrift.TSerializer in project buck by facebook.

the class ThriftOverHttpServiceTest method testSendValidMessageAndReturnValidResponse.

@Test
public void testSendValidMessageAndReturnValidResponse() throws IOException, TException {
    FrontendRequest request = new FrontendRequest();
    request.setType(FrontendRequestType.BUILD_STATUS);
    FrontendResponse expectedResponse = new FrontendResponse();
    expectedResponse.setType(FrontendRequestType.START_BUILD);
    Capture<Request.Builder> requestBuilder = EasyMock.newCapture();
    TSerializer serializer = new TSerializer(config.getThriftProtocol().getFactory());
    final byte[] responseBuffer = serializer.serialize(expectedResponse);
    HttpResponse httpResponse = new HttpResponse() {

        @Override
        public int statusCode() {
            return 200;
        }

        @Override
        public String statusMessage() {
            return "super cool msg";
        }

        @Override
        public long contentLength() throws IOException {
            return responseBuffer.length;
        }

        @Override
        public InputStream getBody() {
            return new ByteArrayInputStream(responseBuffer);
        }

        @Override
        public String requestUrl() {
            return "super url";
        }

        @Override
        public void close() throws IOException {
        // do nothing.
        }
    };
    EasyMock.expect(httpService.makeRequest(EasyMock.eq("/thrift"), EasyMock.capture(requestBuilder))).andReturn(httpResponse).times(1);
    EasyMock.replay(httpService);
    FrontendResponse actualResponse = new FrontendResponse();
    service.makeRequest(request, actualResponse);
    Assert.assertEquals(expectedResponse, actualResponse);
    EasyMock.verify(httpService);
}
Also used : TSerializer(org.apache.thrift.TSerializer) ByteArrayInputStream(java.io.ByteArrayInputStream) FrontendResponse(com.facebook.buck.distributed.thrift.FrontendResponse) FrontendRequest(com.facebook.buck.distributed.thrift.FrontendRequest) Test(org.junit.Test)
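The test stubs the HTTP layer so that the bytes produced by TSerializer come back as the response body, and ThriftOverHttpService fills actualResponse from them. A minimal sketch of that round trip in isolation, reusing the test's config and responseBuffer (the real deserialization happens inside the service under test):

// Decode the serialized response with the same protocol factory the test uses.
TDeserializer deserializer = new TDeserializer(config.getThriftProtocol().getFactory());
FrontendResponse roundTripped = new FrontendResponse();
deserializer.deserialize(roundTripped, responseBuffer);
Assert.assertEquals(expectedResponse, roundTripped);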

Example 19 with TSerializer

use of org.apache.thrift.TSerializer in project druid by druid-io.

the class ThriftInputRowParserTest method testParse.

@Test
public void testParse() throws Exception {
    ThriftInputRowParser parser = new ThriftInputRowParser(parseSpec, "example/book.jar", "io.druid.data.input.thrift.Book");
    Book book = new Book().setDate("2016-08-29").setPrice(19.9).setTitle("title").setAuthor(new Author().setFirstName("first").setLastName("last"));
    TSerializer serializer;
    byte[] bytes;
    // 1. compact
    serializer = new TSerializer(new TCompactProtocol.Factory());
    bytes = serializer.serialize(book);
    serializationAndTest(parser, bytes);
    // 2. binary + base64
    serializer = new TSerializer(new TBinaryProtocol.Factory());
    serializationAndTest(parser, Base64.encodeBase64(serializer.serialize(book)));
    // 3. json
    serializer = new TSerializer(new TJSONProtocol.Factory());
    bytes = serializer.serialize(book);
    serializationAndTest(parser, bytes);
}
Also used : TSerializer(org.apache.thrift.TSerializer) Test(org.junit.Test)
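serializationAndTest is not shown above; conceptually it hands the serialized bytes to the parser and checks the resulting row. A stand-alone round trip for the compact case, assuming only the Thrift-generated Book class (this is not Druid's actual helper):

// Sketch: serialize with the compact protocol, decode the bytes back, and compare.
byte[] compactBytes = new TSerializer(new TCompactProtocol.Factory()).serialize(book);
TDeserializer deserializer = new TDeserializer(new TCompactProtocol.Factory());
Book decoded = new Book();
deserializer.deserialize(decoded, compactBytes);
Assert.assertEquals(book, decoded);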

Example 20 with TSerializer

use of org.apache.thrift.TSerializer in project grpc-java by grpc.

the class ThriftUtils method metadataMarshaller.

/** Produce a metadata marshaller. */
public static <T extends TBase<T, ?>> Metadata.BinaryMarshaller<T> metadataMarshaller(final MessageFactory<T> factory) {
    return new Metadata.BinaryMarshaller<T>() {

        @Override
        public byte[] toBytes(T value) {
            try {
                TSerializer serializer = new TSerializer();
                return serializer.serialize(value);
            } catch (TException e) {
                throw Status.INTERNAL.withDescription("Error in serializing Thrift Message").withCause(e).asRuntimeException();
            }
        }

        @Override
        public T parseBytes(byte[] serialized) {
            try {
                TDeserializer deserializer = new TDeserializer();
                T message = factory.newInstance();
                deserializer.deserialize(message, serialized);
                return message;
            } catch (TException e) {
                throw Status.INTERNAL.withDescription("Invalid thrift Byte Sequence").withCause(e).asRuntimeException();
            }
        }
    };
}
Also used : TException(org.apache.thrift.TException) TSerializer(org.apache.thrift.TSerializer) TDeserializer(org.apache.thrift.TDeserializer)
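A hedged usage sketch of how a marshaller like the one above is typically attached to gRPC Metadata. Here `factory` is assumed to be an existing MessageFactory for the message type, and MyMessage stands in for a generated Thrift struct; neither is part of the grpc-java code shown above.

// Binary metadata keys must carry the "-bin" suffix, supplied by BINARY_HEADER_SUFFIX.
Metadata.Key<MyMessage> key = Metadata.Key.of(
        "my-message" + Metadata.BINARY_HEADER_SUFFIX,
        ThriftUtils.metadataMarshaller(factory));
Metadata headers = new Metadata();
headers.put(key, myMessage);          // serialized via TSerializer in toBytes()
MyMessage parsed = headers.get(key);  // parsed back via TDeserializer in parseBytes()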

Aggregations

TSerializer (org.apache.thrift.TSerializer)35 TException (org.apache.thrift.TException)19 TJSONProtocol (org.apache.thrift.protocol.TJSONProtocol)14 IOException (java.io.IOException)12 TBinaryProtocol (org.apache.thrift.protocol.TBinaryProtocol)8 Test (org.junit.Test)6 TDeserializer (org.apache.thrift.TDeserializer)5 ArrayList (java.util.ArrayList)4 SemanticException (org.apache.hadoop.hive.ql.parse.SemanticException)4 OptRun (org.vcell.optimization.thrift.OptRun)4 Table (org.apache.hadoop.hive.metastore.api.Table)3 TCompactProtocol (org.apache.thrift.protocol.TCompactProtocol)3 VCellApiClient (org.vcell.api.client.VCellApiClient)3 OptProblem (org.vcell.optimization.thrift.OptProblem)3 OptRunStatus (org.vcell.optimization.thrift.OptRunStatus)3 MathException (cbit.vcell.math.MathException)2 RowColumnResultSet (cbit.vcell.math.RowColumnResultSet)2 OptSolverResultSet (cbit.vcell.opt.OptSolverResultSet)2 OptRunResultSet (cbit.vcell.opt.OptSolverResultSet.OptRunResultSet)2 OptimizationException (cbit.vcell.opt.OptimizationException)2