Example 26 with DataSetException

Use of co.cask.cdap.api.dataset.DataSetException in project cdap by caskdata.

From class ObjectStoreDataset, method encode.

private byte[] encode(T object) {
    // encode T using schema
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    BinaryEncoder encoder = new BinaryEncoder(bos);
    try {
        this.datumWriter.encode(object, encoder);
    } catch (IOException e) {
        // should never happen: writing to an in-memory ByteArrayOutputStream cannot throw IOException
        throw new DataSetException("Failed to encode object to be written: " + e.getMessage(), e);
    }
    return bos.toByteArray();
}
Also used : DataSetException(co.cask.cdap.api.dataset.DataSetException), BinaryEncoder(co.cask.cdap.common.io.BinaryEncoder), ByteArrayOutputStream(java.io.ByteArrayOutputStream), IOException(java.io.IOException)
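
The pattern here, catching a checked IOException that cannot actually occur when writing to an in-memory buffer and rethrowing it unchecked, stands on its own. Below is a minimal stand-alone sketch of the same idiom using standard-library serialization in place of CDAP's schema-based DatumWriter; the class name and the IllegalStateException are illustrative, not CDAP's.

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;

public final class EncodeSketch {

    // Serialize to an in-memory buffer. A ByteArrayOutputStream never
    // raises IOException, so the checked exception is wrapped in an
    // unchecked one, just as encode() wraps it in DataSetException.
    static byte[] encode(Object object) {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(object);
        } catch (IOException e) {
            throw new IllegalStateException("Failed to encode object to be written: " + e.getMessage(), e);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) {
        System.out.println(encode("hello").length + " bytes");
    }
}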

Example 27 with DataSetException

Use of co.cask.cdap.api.dataset.DataSetException in project cdap by caskdata.

From class LevelDBTable, method scanPersisted.

@ReadOnly
@Override
protected Scanner scanPersisted(Scan scan) throws Exception {
    FuzzyRowFilter filter = null;
    if (scan.getFilter() != null) {
        // todo: currently we support only FuzzyRowFilter as an experimental feature
        if (scan.getFilter() instanceof FuzzyRowFilter) {
            filter = (FuzzyRowFilter) scan.getFilter();
        } else {
            throw new DataSetException("Unknown filter type: " + scan.getFilter());
        }
    }
    final Scanner scanner = core.scan(scan.getStartRow(), scan.getStopRow(), filter, null, tx);
    return new Scanner() {

        @Nullable
        @Override
        public Row next() {
            return LevelDBTable.this.next(scanner);
        }

        @Override
        public void close() {
            scanner.close();
        }
    };
}
Also used : Scanner(co.cask.cdap.api.dataset.table.Scanner), DataSetException(co.cask.cdap.api.dataset.DataSetException), FuzzyRowFilter(co.cask.cdap.data2.dataset2.lib.table.FuzzyRowFilter), ReadOnly(co.cask.cdap.api.annotation.ReadOnly)
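
Note that on LevelDB any Scan filter other than FuzzyRowFilter is rejected with a DataSetException, since that is the only filter the implementation supports. Callers do not invoke scanPersisted directly; they go through the Table API and drain the returned Scanner until next() returns null (it is @Nullable by contract), then close it. A sketch of that caller loop, assuming the standard Table.scan(startRow, stopRow) entry point; the counting logic is just for illustration.

import co.cask.cdap.api.dataset.table.Row;
import co.cask.cdap.api.dataset.table.Scanner;
import co.cask.cdap.api.dataset.table.Table;

final class ScanLoopSketch {

    // Drain a scan range: next() returns null once the range is
    // exhausted, and close() releases the underlying iterator.
    static int countRows(Table table, byte[] start, byte[] stop) {
        int count = 0;
        Scanner scanner = table.scan(start, stop);
        try {
            Row row;
            while ((row = scanner.next()) != null) {
                count++;
            }
        } finally {
            scanner.close();
        }
        return count;
    }
}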

Example 28 with DataSetException

Use of co.cask.cdap.api.dataset.DataSetException in project cdap by caskdata.

From class HBaseMetricsTable, method increment.

@Override
public void increment(NavigableMap<byte[], NavigableMap<byte[], Long>> updates) {
    List<Put> puts = Lists.newArrayList();
    for (Map.Entry<byte[], NavigableMap<byte[], Long>> update : updates.entrySet()) {
        Put increment = getIncrementalPut(update.getKey(), update.getValue());
        puts.add(increment);
    }
    try {
        hTable.put(puts);
        hTable.flushCommits();
    } catch (IOException e) {
        // currently there is no other way to extract this from the HBase exception than a string match
        if (e.getMessage() != null && e.getMessage().contains("isn't 64 bits wide")) {
            throw new NumberFormatException("Attempted to increment a value that is not convertible to long.");
        }
        throw new DataSetException("Increment failed on table " + tableId, e);
    }
}
Also used : NavigableMap(java.util.NavigableMap), DataSetException(co.cask.cdap.api.dataset.DataSetException), IOException(java.io.IOException), Map(java.util.Map), SortedMap(java.util.SortedMap), Put(org.apache.hadoop.hbase.client.Put)
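
A sketch of a caller batching several counter updates into one call, assuming the MetricsTable interface that HBaseMetricsTable implements; the row and column names are illustrative. Note that byte[] map keys need an explicit comparator (here Bytes.BYTES_COMPARATOR from CDAP's Bytes utility), because arrays compare by identity.

import java.util.NavigableMap;
import java.util.TreeMap;

import co.cask.cdap.api.common.Bytes;
import co.cask.cdap.data2.dataset2.lib.table.MetricsTable;

final class IncrementSketch {

    // Bump two counters in one row with a single batched call.
    static void recordRequest(MetricsTable metricsTable, boolean failed) {
        NavigableMap<byte[], Long> columns = new TreeMap<>(Bytes.BYTES_COMPARATOR);
        columns.put(Bytes.toBytes("requests"), 1L);
        if (failed) {
            columns.put(Bytes.toBytes("errors"), 1L);
        }
        NavigableMap<byte[], NavigableMap<byte[], Long>> updates = new TreeMap<>(Bytes.BYTES_COMPARATOR);
        updates.put(Bytes.toBytes("service:checkout"), columns);
        try {
            metricsTable.increment(updates);
        } catch (NumberFormatException e) {
            // an existing cell did not hold an 8-byte long counter value
        }
        // any other I/O failure propagates as an unchecked DataSetException
    }
}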

Example 29 with DataSetException

Use of co.cask.cdap.api.dataset.DataSetException in project cdap by caskdata.

From class HBaseMetricsTable, method incrementAndGet.

@Override
public long incrementAndGet(byte[] row, byte[] column, long delta) {
    Increment increment = new Increment(row);
    increment.addColumn(columnFamily, column, delta);
    try {
        Result result = hTable.increment(increment);
        return Bytes.toLong(result.getValue(columnFamily, column));
    } catch (IOException e) {
        // currently there is no other way to extract this from the HBase exception than a string match
        if (e.getMessage() != null && e.getMessage().contains("isn't 64 bits wide")) {
            throw new NumberFormatException("Attempted to increment a value that is not convertible to long," + " row: " + Bytes.toStringBinary(row) + " column: " + Bytes.toStringBinary(column));
        }
        throw new DataSetException("IncrementAndGet failed on table " + tableId, e);
    }
}
Also used : DataSetException(co.cask.cdap.api.dataset.DataSetException), Increment(org.apache.hadoop.hbase.client.Increment), IOException(java.io.IOException), Result(org.apache.hadoop.hbase.client.Result)
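
incrementAndGet is the read-your-write variant: it bumps the cell and returns the post-increment value in a single HBase round trip via HTable.increment. A short usage sketch; the handle and key names are assumptions for illustration.

import co.cask.cdap.api.common.Bytes;
import co.cask.cdap.data2.dataset2.lib.table.MetricsTable;

final class IncrementAndGetSketch {

    // Returns the counter value after adding the delta. Throws
    // NumberFormatException if the existing cell is not an 8-byte long,
    // or DataSetException for any other HBase failure.
    static long bumpAndRead(MetricsTable metricsTable) {
        return metricsTable.incrementAndGet(
            Bytes.toBytes("service:checkout"), Bytes.toBytes("requests"), 1L);
    }
}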

Example 30 with DataSetException

Use of co.cask.cdap.api.dataset.DataSetException in project cdap by caskdata.

From class HBaseMetricsTable, method scan.

@Override
public Scanner scan(@Nullable byte[] startRow, @Nullable byte[] stopRow, @Nullable FuzzyRowFilter filter) {
    ScanBuilder scanBuilder = tableUtil.buildScan();
    configureRangeScan(scanBuilder, startRow, stopRow, filter);
    try {
        ResultScanner resultScanner = hTable.getScanner(scanBuilder.build());
        return new HBaseScanner(resultScanner, columnFamily);
    } catch (IOException e) {
        throw new DataSetException("Scan failed on table " + tableId, e);
    }
}
Also used : ResultScanner(org.apache.hadoop.hbase.client.ResultScanner), DataSetException(co.cask.cdap.api.dataset.DataSetException), ScanBuilder(co.cask.cdap.data2.util.hbase.ScanBuilder), IOException(java.io.IOException)
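
To close the loop, a hedged sketch of reading the counters back: passing null for startRow, stopRow, and filter (all three parameters are @Nullable) requests an unbounded, unfiltered scan, and a failure to open the HBase ResultScanner surfaces as the unchecked DataSetException above. The table handle and row handling are assumptions.

import co.cask.cdap.api.common.Bytes;
import co.cask.cdap.api.dataset.DataSetException;
import co.cask.cdap.api.dataset.table.Row;
import co.cask.cdap.api.dataset.table.Scanner;
import co.cask.cdap.data2.dataset2.lib.table.MetricsTable;

final class MetricsScanSketch {

    static void dumpAll(MetricsTable metricsTable) {
        try {
            // null start/stop/filter: scan every row, unfiltered
            Scanner scanner = metricsTable.scan(null, null, null);
            try {
                Row row;
                while ((row = scanner.next()) != null) {
                    // each Row carries the counter columns of one metrics row
                    System.out.println(Bytes.toStringBinary(row.getRow()));
                }
            } finally {
                scanner.close();
            }
        } catch (DataSetException e) {
            // opening or advancing the HBase scanner failed
        }
    }
}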

Aggregations

DataSetException (co.cask.cdap.api.dataset.DataSetException): 35
IOException (java.io.IOException): 26
Map (java.util.Map): 8
ReadOnly (co.cask.cdap.api.annotation.ReadOnly): 6
PartitionKey (co.cask.cdap.api.dataset.lib.PartitionKey): 5
Result (co.cask.cdap.api.dataset.table.Result): 5
ImmutableMap (com.google.common.collect.ImmutableMap): 5
TransactionFailureException (org.apache.tephra.TransactionFailureException): 5
WriteOnly (co.cask.cdap.api.annotation.WriteOnly): 4
TimePartitionedFileSet (co.cask.cdap.api.dataset.lib.TimePartitionedFileSet): 4
Put (co.cask.cdap.api.dataset.table.Put): 4
HashMap (java.util.HashMap): 4
NavigableMap (java.util.NavigableMap): 4
Location (org.apache.twill.filesystem.Location): 4
Test (org.junit.Test): 4
PartitionNotFoundException (co.cask.cdap.api.dataset.PartitionNotFoundException): 3
Row (co.cask.cdap.api.dataset.table.Row): 3
DatasetId (co.cask.cdap.proto.id.DatasetId): 3
Put (org.apache.hadoop.hbase.client.Put): 3
TransactionAware (org.apache.tephra.TransactionAware): 3