Example 1 with FileReader

Use of org.apache.avro.file.FileReader in the project avro by apache.

From the class TestDataFileReader, method testThrottledInputStream:

@Test
// Regression test for bug AVRO-2944: reading an Avro file could hang or fail
// during the magic header check when the input stream is throttled, i.e. when
// a read fills the buffer with fewer bytes than requested.
public void testThrottledInputStream() throws IOException {
    Schema legacySchema = new Schema.Parser().setValidate(false).setValidateDefaults(false)
            .parse("{\"type\": \"record\", \"name\": \"TestSchema\", \"fields\": "
                    + "[ {\"name\": \"id\", \"type\": [\"long\", \"null\"], \"default\": null}]}");
    File f = Files.createTempFile("testThrottledInputStream", ".avro").toFile();
    try (DataFileWriter<?> w = new DataFileWriter<>(new GenericDatumWriter<>())) {
        w.create(legacySchema, f);
        w.flush();
    }
    // Without the magic header check, the throttled input has no effect.
    FileReader r = new DataFileReader(throttledInputStream(f), new GenericDatumReader<>());
    assertEquals("TestSchema", r.getSchema().getName());
    // With the magic header check, the throttled input should pass too.
    FileReader r2 = DataFileReader.openReader(throttledInputStream(f), new GenericDatumReader<>());
    assertEquals("TestSchema", r2.getSchema().getName());
}
Also used: java.io.File, org.apache.avro.file.DataFileReader, org.apache.avro.file.DataFileWriter, org.apache.avro.file.FileReader, org.junit.Test
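
The throttledInputStream(File) helper is referenced but not shown in this snippet. A minimal sketch of what such a helper might look like, assuming it wraps Avro's SeekableFileInput and deliberately short-reads multi-byte requests (the class name ThrottledInputs and the exact short-read arithmetic below are illustrative, not the project's actual code):

import java.io.File;
import java.io.IOException;

import org.apache.avro.file.SeekableFileInput;
import org.apache.avro.file.SeekableInput;

class ThrottledInputs {
    // Hypothetical helper: returns a SeekableInput that never fills a
    // multi-byte read completely, simulating a slow or throttled source.
    static SeekableInput throttledInputStream(File f) throws IOException {
        SeekableFileInput in = new SeekableFileInput(f);
        return new SeekableInput() {
            @Override
            public int read(byte[] b, int off, int len) throws IOException {
                // Ask the underlying file for one byte fewer than requested
                // (but at least one), so callers that assume a full read break.
                return in.read(b, off, Math.max(1, len - 1));
            }

            @Override
            public void seek(long p) throws IOException {
                in.seek(p);
            }

            @Override
            public long tell() throws IOException {
                return in.tell();
            }

            @Override
            public long length() throws IOException {
                return in.length();
            }

            @Override
            public void close() throws IOException {
                in.close();
            }
        };
    }
}

As the in-test comments note, the point of such a source is that the 4-byte magic header may arrive across several short reads rather than one full read, which is why the test exercises both the DataFileReader constructor and DataFileReader.openReader against it.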

