
Example 1 with WALRecordReader

Use of org.apache.hadoop.hbase.mapreduce.WALInputFormat.WALRecordReader in project hbase by apache.

The method testSplit from the class TestWALRecordReader:

/**
   * Create a new reader from the split, and match the edits against the passed columns.
   */
private void testSplit(InputSplit split, byte[]... columns) throws Exception {
    final WALRecordReader reader = getReader();
    reader.initialize(split, MapReduceTestUtil.createDummyMapTaskAttemptContext(conf));
    for (byte[] column : columns) {
        assertTrue(reader.nextKeyValue());
        Cell cell = reader.getCurrentValue().getCells().get(0);
        if (!Bytes.equals(column, 0, column.length, cell.getQualifierArray(), cell.getQualifierOffset(), cell.getQualifierLength())) {
            // fail(msg) is the idiomatic JUnit equivalent of assertTrue(msg, false)
            fail("expected [" + Bytes.toString(column) + "], actual [" + Bytes.toString(cell.getQualifierArray(), cell.getQualifierOffset(), cell.getQualifierLength()) + "]");
        }
    }
    assertFalse(reader.nextKeyValue());
    reader.close();
}
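
The comparison above matches each expected qualifier against a sub-range of the cell's backing array (offset plus length) rather than copying the qualifier out, because a Cell's backing array can hold more than just the qualifier. As a minimal, dependency-free sketch of that pattern, the hypothetical helper below uses the range overload of java.util.Arrays.equals (Java 9+) in place of HBase's Bytes.equals; the class and method names are illustrative, not part of HBase:

```java
import java.util.Arrays;

// Hypothetical stand-in for Bytes.equals(column, 0, column.length, backing, off, len):
// compares a whole expected qualifier against a sub-range of a larger backing
// array without allocating a copy, mirroring the check in testSplit above.
public class QualifierMatch {
    public static boolean rangeEquals(byte[] expected, byte[] backing, int off, int len) {
        // Arrays.equals with from/to indices (Java 9+) compares the range
        // backing[off, off + len) against the full expected array.
        return Arrays.equals(expected, 0, expected.length, backing, off, off + len);
    }

    public static void main(String[] args) {
        // Example backing array where the qualifier "qual1" starts at offset 7.
        byte[] backing = "rowkey:qual1".getBytes();
        System.out.println(rangeEquals("qual1".getBytes(), backing, 7, 5)); // true
        System.out.println(rangeEquals("qual2".getBytes(), backing, 7, 5)); // false
    }
}
```

The same range-based comparison is what lets the test assert on qualifiers returned by reader.getCurrentValue() without materializing each qualifier as a separate array.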
Also used: WALRecordReader (org.apache.hadoop.hbase.mapreduce.WALInputFormat.WALRecordReader), Cell (org.apache.hadoop.hbase.Cell)
