Example 21 with Cell

Use of org.apache.hadoop.hbase.Cell in project hbase by apache.

From the class AccessControlLists, the method parsePermissions:

private static ListMultimap<String, TablePermission> parsePermissions(byte[] entryName, Result result) {
    ListMultimap<String, TablePermission> perms = ArrayListMultimap.create();
    if (result != null && result.size() > 0) {
        for (Cell kv : result.rawCells()) {
            Pair<String, TablePermission> permissionsOfUserOnTable = parsePermissionRecord(entryName, kv);
            if (permissionsOfUserOnTable != null) {
                String username = permissionsOfUserOnTable.getFirst();
                TablePermission permissions = permissionsOfUserOnTable.getSecond();
                perms.put(username, permissions);
            }
        }
    }
    return perms;
}
Also used: Cell (org.apache.hadoop.hbase.Cell)
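The method above accumulates (user, permission) pairs into a Guava ListMultimap, skipping records that fail to parse. A minimal JDK-only sketch of the same accumulation, using a hypothetical PermissionAccumulator class and plain String permissions in place of TablePermission:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for the Guava ListMultimap accumulation in
// parsePermissions: user -> list of permissions, built with JDK collections.
public class PermissionAccumulator {

    // Each record is a {username, permission} pair; null records are skipped,
    // mirroring the null check on parsePermissionRecord's result.
    public static Map<String, List<String>> accumulate(List<String[]> records) {
        Map<String, List<String>> perms = new HashMap<>();
        for (String[] record : records) {
            if (record == null) {
                continue;
            }
            perms.computeIfAbsent(record[0], k -> new ArrayList<>()).add(record[1]);
        }
        return perms;
    }

    public static void main(String[] args) {
        List<String[]> records = Arrays.asList(
                new String[] { "alice", "READ" },
                new String[] { "alice", "WRITE" },
                null,
                new String[] { "bob", "READ" });
        System.out.println(accumulate(records));
    }
}
```

Guava's ArrayListMultimap.create() does this bookkeeping internally; computeIfAbsent is the closest standard-library equivalent.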

Example 22 with Cell

Use of org.apache.hadoop.hbase.Cell in project hbase by apache.

From the class ChainWALEntryFilter, the method filterCells:

private void filterCells(Entry entry) {
    if (entry == null || cellFilters.length == 0) {
        return;
    }
    ArrayList<Cell> cells = entry.getEdit().getCells();
    int size = cells.size();
    for (int i = size - 1; i >= 0; i--) {
        Cell cell = cells.get(i);
        for (WALCellFilter filter : cellFilters) {
            cell = filter.filterCell(entry, cell);
            if (cell != null) {
                cells.set(i, cell);
            } else {
                cells.remove(i);
                break;
            }
        }
    }
    if (cells.size() < size / 2) {
        cells.trimToSize();
    }
}
Also used: Cell (org.apache.hadoop.hbase.Cell)
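The reverse iteration in filterCells is what makes in-place removal safe: remove(i) only shifts the indices above i, which the loop has already visited. A self-contained sketch of the same filter-chain pattern over plain strings (ReverseFilter and the string filters are illustrative, not HBase API):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.function.UnaryOperator;

// In-place filter chain over an ArrayList, iterating from the end so that
// remove(i) never disturbs indices the loop has yet to visit.
public class ReverseFilter {

    // A filter returning null drops the element, as WALCellFilter does for cells.
    public static void filterInPlace(ArrayList<String> items,
                                     List<UnaryOperator<String>> filters) {
        int size = items.size();
        for (int i = size - 1; i >= 0; i--) {
            String item = items.get(i);
            for (UnaryOperator<String> filter : filters) {
                item = filter.apply(item);
                if (item == null) {
                    break; // a filter dropped this element; stop the chain
                }
            }
            if (item != null) {
                items.set(i, item);
            } else {
                items.remove(i);
            }
        }
        // Shrink the backing array if the chain dropped more than half the
        // items, mirroring the trimToSize() call in filterCells.
        if (items.size() < size / 2) {
            items.trimToSize();
        }
    }

    public static void main(String[] args) {
        ArrayList<String> items = new ArrayList<>(Arrays.asList("keep", "xdrop", "also"));
        filterInPlace(items, Arrays.asList(s -> s.startsWith("x") ? null : s.toUpperCase()));
        System.out.println(items); // [KEEP, ALSO]
    }
}
```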

Example 23 with Cell

Use of org.apache.hadoop.hbase.Cell in project hbase by apache.

From the class ProtobufLogWriter, the method append:

@Override
public void append(Entry entry) throws IOException {
    entry.setCompressionContext(compressionContext);
    entry.getKey().getBuilder(compressor).setFollowingKvCount(entry.getEdit().size()).build().writeDelimitedTo(output);
    for (Cell cell : entry.getEdit().getCells()) {
        // cellEncoder must assume little about the stream, since we write PB and cells in turn.
        cellEncoder.write(cell);
    }
    length.set(output.getPos());
}
Also used: Cell (org.apache.hadoop.hbase.Cell)
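append interleaves a length-delimited protobuf key with the raw cells, which is why the comment stresses that the cell encoder must assume little about the stream: the reader has to re-frame records with no out-of-band index. A JDK-only sketch of that framing, using plain int length prefixes in place of protobuf's writeDelimitedTo (DelimitedWriter is illustrative, not HBase's on-disk format):

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

// Length-prefixed framing: a header record followed by a counted run of cell
// payloads, so a reader can walk the stream record by record.
public class DelimitedWriter {

    public static byte[] appendEntry(byte[] header, byte[][] cells) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeInt(header.length);   // stands in for PB's writeDelimitedTo
        out.write(header);
        out.writeInt(cells.length);    // analogous to setFollowingKvCount
        for (byte[] cell : cells) {
            out.writeInt(cell.length); // each cell carries its own frame
            out.write(cell);
        }
        out.flush();
        return buf.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] encoded = appendEntry(new byte[] { 1, 2 }, new byte[][] { { 3 }, { 4, 5 } });
        System.out.println(encoded.length); // 21
    }
}
```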

Example 24 with Cell

Use of org.apache.hadoop.hbase.Cell in project hbase by apache.

From the class AsyncProtobufLogWriter, the method append:

@Override
public void append(Entry entry) {
    int buffered = output.buffered();
    entry.setCompressionContext(compressionContext);
    try {
        entry.getKey().getBuilder(compressor).setFollowingKvCount(entry.getEdit().size()).build().writeDelimitedTo(asyncOutputWrapper);
    } catch (IOException e) {
        throw new AssertionError("should not happen", e);
    }
    try {
        for (Cell cell : entry.getEdit().getCells()) {
            cellEncoder.write(cell);
        }
    } catch (IOException e) {
        throw new AssertionError("should not happen", e);
    }
    length.addAndGet(output.buffered() - buffered);
}
Also used: IOException (java.io.IOException), InterruptedIOException (java.io.InterruptedIOException), Cell (org.apache.hadoop.hbase.Cell)
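Unlike the synchronous writer, this append cannot declare IOException, so it converts any checked IOException into an AssertionError; when the underlying output is an in-memory buffer, that exception path is genuinely unreachable. A small sketch of the pattern against a ByteArrayOutputStream (NoThrowAppend is illustrative, not HBase API), also mirroring the buffered-delta length accounting:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

// "Should not happen" pattern: the method promises not to throw, and writes
// to an in-memory buffer cannot actually fail, so any IOException is
// converted to an AssertionError instead of being propagated.
public class NoThrowAppend {

    // Returns the number of bytes this call added, mirroring how append()
    // samples output.buffered() before and after the writes.
    public static int append(ByteArrayOutputStream out, byte[] payload) {
        int before = out.size();
        try {
            out.write(payload); // OutputStream.write declares IOException
        } catch (IOException e) {
            throw new AssertionError("should not happen", e);
        }
        return out.size() - before;
    }

    public static void main(String[] args) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        System.out.println(append(out, new byte[] { 1, 2, 3 })); // 3
    }
}
```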

Example 25 with Cell

Use of org.apache.hadoop.hbase.Cell in project hbase by apache.

From the class FSWALEntry, the method stampRegionSequenceId:

/**
   * Here is where a WAL edit gets its sequenceid. SIDE-EFFECT is our stamping the sequenceid into
   * every Cell AND setting the sequenceid into the MVCC WriteEntry!!!!
   * @return The sequenceid we stamped on this edit.
   */
long stampRegionSequenceId(MultiVersionConcurrencyControl.WriteEntry we) throws IOException {
    long regionSequenceId = we.getWriteNumber();
    if (!this.getEdit().isReplay() && inMemstore) {
        for (Cell c : getEdit().getCells()) {
            CellUtil.setSequenceId(c, regionSequenceId);
        }
    }
    getKey().setWriteEntry(we);
    return regionSequenceId;
}
Also used: Cell (org.apache.hadoop.hbase.Cell)
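As the javadoc warns, stampRegionSequenceId works by side effect: every cell of the edit is mutated in place with the MVCC write number, and that same number is returned. A minimal model of the stamping with a hypothetical SimpleCell (standing in for org.apache.hadoop.hbase.Cell and CellUtil.setSequenceId):

```java
import java.util.Arrays;
import java.util.List;

// Mini-model of stampRegionSequenceId: the MVCC write number is stamped onto
// every cell in the edit, and the same number is returned to the caller.
public class SequenceStamper {

    // Illustrative stand-in for org.apache.hadoop.hbase.Cell.
    static final class SimpleCell {
        long sequenceId;
    }

    public static long stamp(List<SimpleCell> cells, long writeNumber) {
        for (SimpleCell c : cells) {
            c.sequenceId = writeNumber; // side effect, like CellUtil.setSequenceId
        }
        return writeNumber;
    }

    public static void main(String[] args) {
        List<SimpleCell> cells = Arrays.asList(new SimpleCell(), new SimpleCell());
        long seq = stamp(cells, 42L);
        System.out.println(seq + " " + cells.get(0).sequenceId); // 42 42
    }
}
```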

Aggregations

Cell (org.apache.hadoop.hbase.Cell): 862
Test (org.junit.Test): 326
ArrayList (java.util.ArrayList): 323
Scan (org.apache.hadoop.hbase.client.Scan): 258
KeyValue (org.apache.hadoop.hbase.KeyValue): 220
Result (org.apache.hadoop.hbase.client.Result): 203
Put (org.apache.hadoop.hbase.client.Put): 159
IOException (java.io.IOException): 123
ResultScanner (org.apache.hadoop.hbase.client.ResultScanner): 106
Get (org.apache.hadoop.hbase.client.Get): 85
Table (org.apache.hadoop.hbase.client.Table): 85
List (java.util.List): 80
TableName (org.apache.hadoop.hbase.TableName): 77
Delete (org.apache.hadoop.hbase.client.Delete): 75
CellScanner (org.apache.hadoop.hbase.CellScanner): 69
Configuration (org.apache.hadoop.conf.Configuration): 62
InterruptedIOException (java.io.InterruptedIOException): 48
Map (java.util.Map): 45
Path (org.apache.hadoop.fs.Path): 45
RegionScanner (org.apache.hadoop.hbase.regionserver.RegionScanner): 45