Example 21 with SpecificDatumWriter

use of org.apache.avro.specific.SpecificDatumWriter in project spf4j by zolyfarkas.

the class Converter method saveLabeledDumps.

public static void saveLabeledDumps(final File file, final Map<String, SampleNode> pcollected) throws IOException {
    try (OutputStream bos = newOutputStream(file)) {
        final SpecificDatumWriter<StackSampleElement> writer = new SpecificDatumWriter<>(StackSampleElement.SCHEMA$);
        final BinaryEncoder encoder = EncoderFactory.get().directBinaryEncoder(bos, null);
        // The dump is encoded as an Avro map from label to an array of StackSampleElement.
        encoder.writeMapStart();
        // Drop labels whose sample tree is null before committing to the map item count.
        final Map<String, SampleNode> collected = pcollected.entrySet().stream()
                .filter((e) -> e.getValue() != null)
                .collect(Collectors.toMap((e) -> e.getKey(), (e) -> e.getValue()));
        encoder.setItemCount(collected.size());
        for (Map.Entry<String, SampleNode> entry : collected.entrySet()) {
            encoder.startItem();
            encoder.writeString(entry.getKey());
            encoder.writeArrayStart();
            // Each flattened stack sample is appended to the array as its own single-item block.
            Converters.convert(Methods.ROOT, entry.getValue(), -1, 0, (final StackSampleElement object) -> {
                try {
                    encoder.setItemCount(1L);
                    encoder.startItem();
                    writer.write(object, encoder);
                } catch (IOException ex) {
                    throw new UncheckedIOException(ex);
                }
            });
            encoder.writeArrayEnd();
        }
        encoder.writeMapEnd();
        encoder.flush();
    }
}
Also used : TIntObjectHashMap(gnu.trove.map.hash.TIntObjectHashMap) GZIPInputStream(java.util.zip.GZIPInputStream) BufferedInputStream(java.io.BufferedInputStream) URLDecoder(java.net.URLDecoder) TIntObjectMap(gnu.trove.map.TIntObjectMap) PushbackInputStream(java.io.PushbackInputStream) HashMap(java.util.HashMap) MemorizingBufferedInputStream(org.spf4j.io.MemorizingBufferedInputStream) ParametersAreNonnullByDefault(javax.annotation.ParametersAreNonnullByDefault) BufferedOutputStream(java.io.BufferedOutputStream) Decoder(org.apache.avro.io.Decoder) SpecificDatumWriter(org.apache.avro.specific.SpecificDatumWriter) Map(java.util.Map) NoSuchElementException(java.util.NoSuchElementException) SampleNode(org.spf4j.stackmonitor.SampleNode) Nullable(javax.annotation.Nullable) EncoderFactory(org.apache.avro.io.EncoderFactory) OutputStream(java.io.OutputStream) WillNotClose(javax.annotation.WillNotClose) Iterator(java.util.Iterator) Files(java.nio.file.Files) Methods(org.spf4j.base.Methods) BinaryDecoder(org.apache.avro.io.BinaryDecoder) IOException(java.io.IOException) Collectors(java.util.stream.Collectors) File(java.io.File) StandardCharsets(java.nio.charset.StandardCharsets) SpecificDatumReader(org.apache.avro.specific.SpecificDatumReader) Converters(org.spf4j.base.avro.Converters) UncheckedIOException(java.io.UncheckedIOException) Consumer(java.util.function.Consumer) BinaryEncoder(org.apache.avro.io.BinaryEncoder) URLEncoder(java.net.URLEncoder) Method(org.spf4j.base.avro.Method) StackSampleElement(org.spf4j.base.avro.StackSampleElement) GZIPOutputStream(java.util.zip.GZIPOutputStream) UnsupportedEncodingException(java.io.UnsupportedEncodingException) SuppressFBWarnings(edu.umd.cs.findbugs.annotations.SuppressFBWarnings) DecoderFactory(org.apache.avro.io.DecoderFactory) InputStream(java.io.InputStream) ProfileFileFormat(org.spf4j.stackmonitor.ProfileFileFormat)
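
For orientation, here is a minimal read-side sketch, not taken from spf4j: the method name, the java.util.List/ArrayList types, and the use of Files.newInputStream are assumptions. It walks the map-of-arrays layout produced above with a SpecificDatumReader and the Avro BinaryDecoder block API. Note that the writer obtains its stream through a project-specific newOutputStream helper, so a real reader would have to mirror whatever buffering or compression that helper applies.

public static Map<String, List<StackSampleElement>> loadLabeledDumps(final File file) throws IOException {
    try (InputStream bis = Files.newInputStream(file.toPath())) {
        final SpecificDatumReader<StackSampleElement> reader = new SpecificDatumReader<>(StackSampleElement.SCHEMA$);
        final BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(bis, null);
        final Map<String, List<StackSampleElement>> result = new HashMap<>();
        // Maps and arrays are block-encoded: keep reading block counts until a zero count is returned.
        for (long labels = decoder.readMapStart(); labels > 0; labels = decoder.mapNext()) {
            for (long i = 0; i < labels; i++) {
                final String label = decoder.readString();
                final List<StackSampleElement> samples = new ArrayList<>();
                for (long items = decoder.readArrayStart(); items > 0; items = decoder.arrayNext()) {
                    for (long j = 0; j < items; j++) {
                        samples.add(reader.read(null, decoder));
                    }
                }
                result.put(label, samples);
            }
        }
        return result;
    }
}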

Example 22 with SpecificDatumWriter

use of org.apache.avro.specific.SpecificDatumWriter in project spf4j by zolyfarkas.

the class Converter method save.

public static void save(final File file, final SampleNode collected) throws IOException {
    try (OutputStream bos = newOutputStream(file)) {
        final SpecificDatumWriter<StackSampleElement> writer = new SpecificDatumWriter<>(StackSampleElement.getClassSchema());
        final BinaryEncoder encoder = EncoderFactory.get().directBinaryEncoder(bos, null);
        Converters.convert(Methods.ROOT, collected, -1, 0, (StackSampleElement object) -> {
            try {
                writer.write(object, encoder);
            } catch (IOException ex) {
                throw new UncheckedIOException(ex);
            }
        });
        encoder.flush();
    }
}
Also used : BinaryEncoder(org.apache.avro.io.BinaryEncoder) BufferedOutputStream(java.io.BufferedOutputStream) OutputStream(java.io.OutputStream) GZIPOutputStream(java.util.zip.GZIPOutputStream) UncheckedIOException(java.io.UncheckedIOException) IOException(java.io.IOException) StackSampleElement(org.spf4j.base.avro.StackSampleElement) SpecificDatumWriter(org.apache.avro.specific.SpecificDatumWriter)
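
A read-side companion sketch under the same caveats (the load name and collection types are illustrative, and whatever wrapping newOutputStream applies must be mirrored on the input side): since save() appends records back to back with no framing or count, a reader can decode until BinaryDecoder reports the end of its input.

public static List<StackSampleElement> load(final File file) throws IOException {
    try (InputStream bis = Files.newInputStream(file.toPath())) {
        final SpecificDatumReader<StackSampleElement> reader = new SpecificDatumReader<>(StackSampleElement.getClassSchema());
        final BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(bis, null);
        final List<StackSampleElement> samples = new ArrayList<>();
        // No record count or delimiter was written, so decode until the underlying input is exhausted.
        while (!decoder.isEnd()) {
            samples.add(reader.read(null, decoder));
        }
        return samples;
    }
}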

Example 23 with SpecificDatumWriter

use of org.apache.avro.specific.SpecificDatumWriter in project spring-cloud-stream by spring-cloud.

the class AvroMessageConverterSerializationTests method testOriginalContentTypeHeaderOnly.

@Test
public void testOriginalContentTypeHeaderOnly() throws Exception {
    User specificRecord = new User();
    specificRecord.setName("joe");
    Schema v1 = new Schema.Parser().parse(AvroMessageConverterSerializationTests.class.getClassLoader().getResourceAsStream("schemas/user.avsc"));
    GenericRecord genericRecord = new GenericData.Record(v1);
    genericRecord.put("name", "joe");
    SchemaRegistryClient client = new DefaultSchemaRegistryClient();
    client.register("user", "avro", v1.toString());
    AvroSchemaRegistryClientMessageConverter converter = new AvroSchemaRegistryClientMessageConverter(client, new NoOpCacheManager());
    converter.setDynamicSchemaGenerationEnabled(false);
    converter.afterPropertiesSet();
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    DatumWriter<User> writer = new SpecificDatumWriter<>(User.class);
    Encoder encoder = EncoderFactory.get().binaryEncoder(baos, null);
    writer.write(specificRecord, encoder);
    encoder.flush();
    Message source = MessageBuilder.withPayload(baos.toByteArray())
            .setHeader(MessageHeaders.CONTENT_TYPE, MimeTypeUtils.APPLICATION_OCTET_STREAM)
            .setHeader(BinderHeaders.BINDER_ORIGINAL_CONTENT_TYPE, "application/vnd.user.v1+avro")
            .build();
    Object converted = converter.fromMessage(source, User.class);
    Assert.assertNotNull(converted);
    Assert.assertEquals(specificRecord.getName().toString(), ((User) converted).getName().toString());
}
Also used : User(example.avro.User) Message(org.springframework.messaging.Message) Schema(org.apache.avro.Schema) NoOpCacheManager(org.springframework.cache.support.NoOpCacheManager) ByteArrayOutputStream(java.io.ByteArrayOutputStream) SpecificDatumWriter(org.apache.avro.specific.SpecificDatumWriter) AvroSchemaRegistryClientMessageConverter(org.springframework.cloud.stream.schema.avro.AvroSchemaRegistryClientMessageConverter) Encoder(org.apache.avro.io.Encoder) GenericRecord(org.apache.avro.generic.GenericRecord) DefaultSchemaRegistryClient(org.springframework.cloud.stream.schema.client.DefaultSchemaRegistryClient) SchemaRegistryClient(org.springframework.cloud.stream.schema.client.SchemaRegistryClient) Test(org.junit.Test)
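
For completeness, a short sketch of decoding the same payload by hand with the matching SpecificDatumReader; this is not part of the original test and assumes the usual org.apache.avro.io.DatumReader and DecoderFactory imports.

    // Decode the bytes produced above with the read-side counterpart of SpecificDatumWriter.
    DatumReader<User> reader = new SpecificDatumReader<>(User.class);
    Decoder decoder = DecoderFactory.get().binaryDecoder(baos.toByteArray(), null);
    User decoded = reader.read(null, decoder);
    // The schema is resolved from the generated User class, so the name field round-trips as written.
    Assert.assertEquals("joe", decoded.getName().toString());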

Example 24 with SpecificDatumWriter

use of org.apache.avro.specific.SpecificDatumWriter in project divolte-collector by divolte.

the class AvroGenericRecordMapperTest method testMapping.

@Test
public void testMapping() throws Exception {
    /*
         * Test what happens when JSON samples are mapped to a specific Avro schema.
         *
         * The outcome can be either:
         *  - An expected JSON result (defaults to the input JSON)
         *  - An expected exception occurs.
         *
         * A fixture can also specify that specific deserialization options be [in]active.
         */
    try {
        final Object avroResult = reader.read(testFixture.jsonToMap, testFixture.avroSchema);
        // If we expected an exception, fail...
        testFixture.expectedException.ifPresent(e -> fail("Expected exception to be thrown: " + e));
        // ...otherwise verify the result...
        final JsonNode avroResultJson = JSON_MAPPER.readTree(GenericData.get().toString(avroResult));
        assertEquals(testFixture.expectedJson, avroResultJson);
        // ...and ensure it can be written out as a record.
        final DatumWriter<Object> writer = new SpecificDatumWriter<>(testFixture.avroSchema);
        try (ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream()) {
            final Encoder encoder = EncoderFactory.get().directBinaryEncoder(byteArrayOutputStream, null);
            writer.write(avroResult, encoder);
        }
    } catch (final Exception e) {
        // Suppress the exception if it was expected; otherwise rethrow.
        testFixture.expectedException.filter(ee -> ee.isInstance(e)).orElseThrow(() -> e);
    }
}
Also used : Encoder(org.apache.avro.io.Encoder) ByteArrayOutputStream(java.io.ByteArrayOutputStream) JsonProcessingException(com.fasterxml.jackson.core.JsonProcessingException) IOException(java.io.IOException) SpecificDatumWriter(org.apache.avro.specific.SpecificDatumWriter) Test(org.junit.Test)
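
A possible variation, not part of the divolte test: the same writer can emit the mapped result as Avro JSON by swapping in a JsonEncoder, which is convenient when inspecting fixtures by eye. Keep in mind that Avro's JSON encoding tags union branches, so its output will not always match the plain JSON compared via GenericData.toString() above.

    try (ByteArrayOutputStream jsonOut = new ByteArrayOutputStream()) {
        // JsonEncoder needs the schema up front, and flush() is required before reading the buffer.
        final Encoder jsonEncoder = EncoderFactory.get().jsonEncoder(testFixture.avroSchema, jsonOut);
        writer.write(avroResult, jsonEncoder);
        jsonEncoder.flush();
        System.out.println(jsonOut.toString("UTF-8"));
    }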

Example 25 with SpecificDatumWriter

use of org.apache.avro.specific.SpecificDatumWriter in project avro by apache.

the class TestSeekableByteArrayInput method getSerializedMessage.

private byte[] getSerializedMessage(IndexedRecord message, Schema schema) throws Exception {
    ByteArrayOutputStream baos = new ByteArrayOutputStream(4096);
    SpecificDatumWriter<IndexedRecord> writer = new SpecificDatumWriter<>();
    try (DataFileWriter<IndexedRecord> dfw = new DataFileWriter<>(writer).create(schema, baos)) {
        dfw.append(message);
    }
    return baos.toByteArray();
}
Also used : IndexedRecord(org.apache.avro.generic.IndexedRecord) ByteArrayOutputStream(java.io.ByteArrayOutputStream) SpecificDatumWriter(org.apache.avro.specific.SpecificDatumWriter)
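
For symmetry, a hypothetical counterpart (the method name is illustrative) that reads such container-file bytes back through SeekableByteArrayInput, the class the surrounding TestSeekableByteArrayInput test exercises; it assumes the org.apache.avro.file.DataFileReader and SeekableByteArrayInput imports.

private IndexedRecord getDeserializedMessage(byte[] bytes, Schema schema) throws Exception {
    SpecificDatumReader<IndexedRecord> reader = new SpecificDatumReader<>(schema);
    // DataFileReader needs random access, which SeekableByteArrayInput provides over the in-memory bytes.
    try (DataFileReader<IndexedRecord> dfr = new DataFileReader<>(new SeekableByteArrayInput(bytes), reader)) {
        return dfr.next();
    }
}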

Aggregations

SpecificDatumWriter (org.apache.avro.specific.SpecificDatumWriter): 115
ByteArrayOutputStream (java.io.ByteArrayOutputStream): 53
Schema (org.apache.avro.Schema): 36
BinaryEncoder (org.apache.avro.io.BinaryEncoder): 30
DataFileWriter (org.apache.avro.file.DataFileWriter): 27
Test (org.junit.Test): 26
IOException (java.io.IOException): 23
GenericRecord (org.apache.avro.generic.GenericRecord): 20
Encoder (org.apache.avro.io.Encoder): 18
File (java.io.File): 12
ByteBuffer (java.nio.ByteBuffer): 12
JsonEncoder (org.apache.avro.io.JsonEncoder): 12
ArrayList (java.util.ArrayList): 10
HashMap (java.util.HashMap): 10
SpecificDatumReader (org.apache.avro.specific.SpecificDatumReader): 9
Path (java.nio.file.Path): 8
Avro1124SubjectAndIdConverter (org.apache.druid.data.input.schemarepo.Avro1124SubjectAndIdConverter): 8
InMemoryRepository (org.schemarepo.InMemoryRepository): 8
Repository (org.schemarepo.Repository): 8
TypedSchemaRepository (org.schemarepo.api.TypedSchemaRepository): 8