
Example 6 with Reader

Use of org.apache.hadoop.io.SequenceFile.Reader in project hadoop by apache.

The class TestSequenceFileAppend, method verifyAll4Values:

private void verifyAll4Values(Path file) throws IOException {
    Reader reader = new Reader(conf, Reader.file(file));
    // next((Object) null) deserializes and returns the next key;
    // getCurrentValue((Object) null) does the same for the current value.
    assertEquals(1L, reader.next((Object) null));
    assertEquals("one", reader.getCurrentValue((Object) null));
    assertEquals(2L, reader.next((Object) null));
    assertEquals("two", reader.getCurrentValue((Object) null));
    assertEquals(3L, reader.next((Object) null));
    assertEquals("three", reader.getCurrentValue((Object) null));
    assertEquals(4L, reader.next((Object) null));
    assertEquals("four", reader.getCurrentValue((Object) null));
    // next() returns null once all four appended records are consumed.
    assertNull(reader.next((Object) null));
    reader.close();
}
Also used: Reader (org.apache.hadoop.io.SequenceFile.Reader)
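The test above drives the Reader through the raw serialization API (`next((Object) null)` / `getCurrentValue((Object) null)`). For ordinary reading, the typed loop below is the more common idiom. This is a sketch, not part of the Hadoop test: it assumes hadoop-common on the classpath and LongWritable/Text records matching the 1L/"one" through 4L/"four" pairs verified above.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SequenceFileReadLoop {

    // Reads every record, assuming LongWritable keys and Text values.
    public static void readAll(Configuration conf, Path file) throws IOException {
        // Reader implements Closeable, so try-with-resources handles cleanup.
        try (SequenceFile.Reader reader =
                 new SequenceFile.Reader(conf, SequenceFile.Reader.file(file))) {
            LongWritable key = new LongWritable();
            Text value = new Text();
            // next(key, value) fills both objects in place and
            // returns false once the file is exhausted.
            while (reader.next(key, value)) {
                System.out.println(key.get() + " -> " + value);
            }
        }
    }
}
```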

Example 7 with Reader

Use of org.apache.hadoop.io.SequenceFile.Reader in project asterixdb by apache.

The class SequenceLookupReader, method openFile:

@SuppressWarnings("deprecation")
@Override
protected void openFile() throws IllegalArgumentException, IOException {
    // Reader(fs, path, conf) is deprecated; the key and value classes
    // are resolved from the SequenceFile header via reflection.
    reader = new SequenceFile.Reader(fs, new Path(file.getFileName()), conf);
    key = (Writable) ReflectionUtils.newInstance(reader.getKeyClass(), conf);
    value = (Text) ReflectionUtils.newInstance(reader.getValueClass(), conf);
}
Also used: Path (org.apache.hadoop.fs.Path), Reader (org.apache.hadoop.io.SequenceFile.Reader), SequenceFile (org.apache.hadoop.io.SequenceFile)
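The `@SuppressWarnings("deprecation")` is needed because the `Reader(FileSystem, Path, Configuration)` constructor has been deprecated in favor of the option-based form. Below is a hedged sketch of the non-deprecated equivalent; the class and method names are illustrative, not part of AsterixDB.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.util.ReflectionUtils;

public class OptionBasedOpen {

    public static SequenceFile.Reader open(Configuration conf, Path path)
            throws IOException {
        // Reader.file(path) resolves the FileSystem from the Configuration,
        // so the explicit FileSystem argument of the old constructor goes away.
        SequenceFile.Reader reader =
            new SequenceFile.Reader(conf, SequenceFile.Reader.file(path));
        // Key/value classes still come from the file header, exactly as
        // in openFile() above; a real caller would keep these in fields.
        Writable key = (Writable)
            ReflectionUtils.newInstance(reader.getKeyClass(), conf);
        Writable value = (Writable)
            ReflectionUtils.newInstance(reader.getValueClass(), conf);
        return reader;
    }
}
```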

Example 8 with Reader

Use of org.apache.hadoop.io.SequenceFile.Reader in project incubator-systemml by apache.

The class TfUtils, method initOffsetsReader:

private Reader initOffsetsReader(JobConf job) throws IOException {
    // The row-id offsets written by CSVReblockMR are expected in a
    // single SequenceFile under this path.
    Path path = new Path(job.get(CSVReblockMR.ROWID_FILE_NAME));
    FileSystem fs = IOUtilFunctions.getFileSystem(path, job);
    Path[] files = MatrixReader.getSequenceFilePaths(fs, path);
    if (files.length != 1)
        throw new IOException("Expecting a single file under counters file: " + path.toString());
    Reader reader = new SequenceFile.Reader(fs, files[0], job);
    return reader;
}
Also used: Path (org.apache.hadoop.fs.Path), FileSystem (org.apache.hadoop.fs.FileSystem), MatrixReader (org.apache.sysml.runtime.io.MatrixReader), Reader (org.apache.hadoop.io.SequenceFile.Reader), IOException (java.io.IOException)
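The length-check guard in initOffsetsReader is independent of Hadoop and can be factored into a plain helper. The name `requireSingleFile` below is illustrative, not part of the SystemML API:

```java
import java.io.IOException;

public class SingleFileGuard {

    // Returns the sole element of files, mirroring the guard in
    // initOffsetsReader; throws the same IOException otherwise.
    public static <T> T requireSingleFile(T[] files, String where) throws IOException {
        if (files.length != 1)
            throw new IOException("Expecting a single file under counters file: " + where);
        return files[0];
    }
}
```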

Aggregations

Reader (org.apache.hadoop.io.SequenceFile.Reader): 8
Path (org.apache.hadoop.fs.Path): 5
Test (org.junit.Test): 3
Writer (org.apache.hadoop.io.SequenceFile.Writer): 2
MatrixReader (org.apache.sysml.runtime.io.MatrixReader): 2
Pipeline (com.google.cloud.dataflow.sdk.Pipeline): 1
KV (com.google.cloud.dataflow.sdk.values.KV): 1
IOException (java.io.IOException): 1
Configuration (org.apache.hadoop.conf.Configuration): 1
FileSystem (org.apache.hadoop.fs.FileSystem): 1
ByteWritable (org.apache.hadoop.io.ByteWritable): 1
IntWritable (org.apache.hadoop.io.IntWritable): 1
SequenceFile (org.apache.hadoop.io.SequenceFile): 1
Option (org.apache.hadoop.io.SequenceFile.Writer.Option): 1
Text (org.apache.hadoop.io.Text): 1
DefaultCodec (org.apache.hadoop.io.compress.DefaultCodec): 1
GzipCodec (org.apache.hadoop.io.compress.GzipCodec): 1
FileInputFormat (org.apache.hadoop.mapreduce.lib.input.FileInputFormat): 1
SequenceFileInputFormat (org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat): 1
FileOutputFormat (org.apache.hadoop.mapreduce.lib.output.FileOutputFormat): 1