
Example 66 with FlinkRuntimeException

Use of org.apache.flink.util.FlinkRuntimeException in project flink by apache, from the value() method of the class RocksDBValueState.

@Override
public V value() {
    try {
        // look up the serialized value under the current key / key-group / namespace
        byte[] valueBytes = backend.db.get(columnFamily, serializeCurrentKeyWithGroupAndNamespace());
        if (valueBytes == null) {
            // nothing stored yet for this key and namespace: return the configured default
            return getDefaultValue();
        }
        dataInputView.setBuffer(valueBytes);
        return valueSerializer.deserialize(dataInputView);
    } catch (IOException | RocksDBException e) {
        throw new FlinkRuntimeException("Error while retrieving data from RocksDB.", e);
    }
}
Also used: RocksDBException (org.rocksdb.RocksDBException), FlinkRuntimeException (org.apache.flink.util.FlinkRuntimeException), IOException (java.io.IOException)
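
For orientation, here is a hedged caller-side sketch (not taken from the Flink sources): user code only programs against the ValueState interface, and with the RocksDB backend configured, count.value() below is served by RocksDBValueState.value() shown above. The class name CountPerKey and the state name "count" are illustrative.

import org.apache.flink.api.common.functions.RichFlatMapFunction;
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.util.Collector;

// Illustrative user function: counts elements per key using ValueState.
public class CountPerKey extends RichFlatMapFunction<String, Long> {

    private transient ValueState<Long> count;

    @Override
    public void open(Configuration parameters) {
        count = getRuntimeContext().getState(
                new ValueStateDescriptor<>("count", Long.class));
    }

    @Override
    public void flatMap(String value, Collector<Long> out) throws Exception {
        // null on first access for this key, since no default value is configured
        Long current = count.value();
        long next = (current == null) ? 1L : current + 1L;
        count.update(next);
        out.collect(next);
    }
}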

Example 67 with FlinkRuntimeException

Use of org.apache.flink.util.FlinkRuntimeException in project flink by apache, from the mergeNamespaces() method of the class RocksDBReducingState.

@Override
public void mergeNamespaces(N target, Collection<N> sources) {
    if (sources == null || sources.isEmpty()) {
        return;
    }
    try {
        V current = null;
        // merge the sources to the target
        for (N source : sources) {
            if (source != null) {
                setCurrentNamespace(source);
                final byte[] sourceKey = serializeCurrentKeyWithGroupAndNamespace();
                final byte[] valueBytes = backend.db.get(columnFamily, sourceKey);
                if (valueBytes != null) {
                    backend.db.delete(columnFamily, writeOptions, sourceKey);
                    dataInputView.setBuffer(valueBytes);
                    V value = valueSerializer.deserialize(dataInputView);
                    if (current != null) {
                        current = reduceFunction.reduce(current, value);
                    } else {
                        current = value;
                    }
                }
            }
        }
        // if something came out of merging the sources, merge it or write it to the target
        if (current != null) {
            // create the target full-binary-key
            setCurrentNamespace(target);
            final byte[] targetKey = serializeCurrentKeyWithGroupAndNamespace();
            final byte[] targetValueBytes = backend.db.get(columnFamily, targetKey);
            if (targetValueBytes != null) {
                dataInputView.setBuffer(targetValueBytes);
                // target also had a value, merge
                V value = valueSerializer.deserialize(dataInputView);
                current = reduceFunction.reduce(current, value);
            }
            // serialize the resulting value
            dataOutputView.clear();
            valueSerializer.serialize(current, dataOutputView);
            // write the resulting value
            backend.db.put(columnFamily, writeOptions, targetKey, dataOutputView.getCopyOfBuffer());
        }
    } catch (Exception e) {
        throw new FlinkRuntimeException("Error while merging state in RocksDB", e);
    }
}
Also used: FlinkRuntimeException (org.apache.flink.util.FlinkRuntimeException)
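
mergeNamespaces folds all source values pairwise through the state's ReduceFunction. As a point of reference, here is a minimal sketch of such a function (a summing reducer, not part of the snippet above) that a reducing state could be created with:

import org.apache.flink.api.common.functions.ReduceFunction;

// Minimal summing reducer: mergeNamespaces() calls reduce() once per
// additional value it finds under the source and target namespaces.
public class LongSumReducer implements ReduceFunction<Long> {
    @Override
    public Long reduce(Long value1, Long value2) {
        return value1 + value2;
    }
}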

Example 68 with FlinkRuntimeException

Use of org.apache.flink.util.FlinkRuntimeException in project flink by apache, from the getInternal() method of the class AbstractRocksDBAppendingState.

SV getInternal(byte[] key) {
    try {
        byte[] valueBytes = backend.db.get(columnFamily, key);
        if (valueBytes == null) {
            // no accumulator stored under this key yet
            return null;
        }
        dataInputView.setBuffer(valueBytes);
        return valueSerializer.deserialize(dataInputView);
    } catch (IOException | RocksDBException e) {
        throw new FlinkRuntimeException("Error while retrieving data from RocksDB", e);
    }
}
Also used: RocksDBException (org.rocksdb.RocksDBException), FlinkRuntimeException (org.apache.flink.util.FlinkRuntimeException), IOException (java.io.IOException)
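
The dataInputView / dataOutputView fields used by these state classes are Flink's DataInputDeserializer / DataOutputSerializer; the states simply run a serializer round trip around the raw bytes stored in RocksDB. Below is a self-contained sketch of that round trip, using a StringSerializer purely for illustration and no RocksDB instance:

import org.apache.flink.api.common.typeutils.base.StringSerializer;
import org.apache.flink.core.memory.DataInputDeserializer;
import org.apache.flink.core.memory.DataOutputSerializer;

public class SerializerRoundTrip {
    public static void main(String[] args) throws Exception {
        // serialize a value into the byte[] form that would be handed to db.put()
        DataOutputSerializer out = new DataOutputSerializer(32);
        StringSerializer.INSTANCE.serialize("hello", out);
        byte[] valueBytes = out.getCopyOfBuffer();

        // deserialize the byte[] form that db.get() would return
        DataInputDeserializer in = new DataInputDeserializer();
        in.setBuffer(valueBytes);
        String restored = StringSerializer.INSTANCE.deserialize(in);
        System.out.println(restored); // prints "hello"
    }
}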

Example 69 with FlinkRuntimeException

Use of org.apache.flink.util.FlinkRuntimeException in project flink by apache, from the createNewColumnFamily() method of the class RocksDBResource.

/**
 * Creates and returns a new column family with the given name.
 */
public ColumnFamilyHandle createNewColumnFamily(String name) {
    try {
        final ColumnFamilyHandle columnFamily = rocksDB.createColumnFamily(new ColumnFamilyDescriptor(name.getBytes(), columnFamilyOptions));
        columnFamilyHandles.add(columnFamily);
        return columnFamily;
    } catch (Exception ex) {
        throw new FlinkRuntimeException("Could not create column family.", ex);
    }
}
Also used: FlinkRuntimeException (org.apache.flink.util.FlinkRuntimeException), ColumnFamilyDescriptor (org.rocksdb.ColumnFamilyDescriptor), ColumnFamilyHandle (org.rocksdb.ColumnFamilyHandle)
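
A hedged usage sketch, assuming RocksDBResource keeps its place in the Flink test sources (org.apache.flink.contrib.streaming.state) and offers a parameterless constructor: registered as a JUnit rule, it hands out column families on demand and closes the tracked handles on tear-down. The test and column family names are illustrative.

import static org.junit.Assert.assertNotNull;

import org.apache.flink.contrib.streaming.state.RocksDBResource;
import org.junit.Rule;
import org.junit.Test;
import org.rocksdb.ColumnFamilyHandle;

public class ColumnFamilyCreationTest {

    @Rule
    public final RocksDBResource rocksDBResource = new RocksDBResource();

    @Test
    public void createsColumnFamily() {
        // the resource records the handle so it can be closed when the test finishes
        ColumnFamilyHandle columnFamily = rocksDBResource.createNewColumnFamily("test-cf");
        assertNotNull(columnFamily);
    }
}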

Example 70 with FlinkRuntimeException

Use of org.apache.flink.util.FlinkRuntimeException in project flink by apache, from the createStateBackend() method of the class RocksDBTtlStateTestBase.

StateBackend createStateBackend(TernaryBoolean enableIncrementalCheckpointing) {
    String dbPath;
    String checkpointPath;
    try {
        dbPath = tempFolder.newFolder().getAbsolutePath();
        checkpointPath = tempFolder.newFolder().toURI().toString();
    } catch (IOException e) {
        // include the IOException as the cause so the failure is easier to diagnose
        throw new FlinkRuntimeException("Failed to init rocksdb test state backend", e);
    }
    RocksDBStateBackend backend = new RocksDBStateBackend(new FsStateBackend(checkpointPath), enableIncrementalCheckpointing);
    Configuration config = new Configuration();
    backend = backend.configure(config, Thread.currentThread().getContextClassLoader());
    backend.setDbStoragePath(dbPath);
    return backend;
}
Also used: RocksDBStateBackend (org.apache.flink.contrib.streaming.state.RocksDBStateBackend), Configuration (org.apache.flink.configuration.Configuration), FlinkRuntimeException (org.apache.flink.util.FlinkRuntimeException), IOException (java.io.IOException), FsStateBackend (org.apache.flink.runtime.state.filesystem.FsStateBackend)
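
For orientation, a hedged sketch (not from the Flink sources) of how a RocksDB backend like the one built above might be wired into a streaming job; the checkpoint URI and local storage path are placeholder values:

import org.apache.flink.contrib.streaming.state.RocksDBStateBackend;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class RocksDBBackendWiring {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // incremental checkpointing enabled, mirroring the TernaryBoolean flag above
        RocksDBStateBackend backend = new RocksDBStateBackend("file:///tmp/checkpoints", true);
        backend.setDbStoragePath("/tmp/rocksdb");
        env.setStateBackend(backend);
    }
}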

Aggregations

Usage counts of the types that co-occur with FlinkRuntimeException across the indexed examples:

FlinkRuntimeException (org.apache.flink.util.FlinkRuntimeException): 78
IOException (java.io.IOException): 28
Test (org.junit.Test): 13
JobID (org.apache.flink.api.common.JobID): 10
HashMap (java.util.HashMap): 8
ArrayList (java.util.ArrayList): 7
CompletableFuture (java.util.concurrent.CompletableFuture): 7
ExecutionException (java.util.concurrent.ExecutionException): 7
Nonnull (javax.annotation.Nonnull): 7
Configuration (org.apache.flink.configuration.Configuration): 6
Collectors (java.util.stream.Collectors): 5
JobGraph (org.apache.flink.runtime.jobgraph.JobGraph): 5
JobResultStore (org.apache.flink.runtime.highavailability.JobResultStore): 4
RocksDBException (org.rocksdb.RocksDBException): 4
List (java.util.List): 3
Map (java.util.Map): 3
CheckpointMetrics (org.apache.flink.runtime.checkpoint.CheckpointMetrics): 3
TaskStateSnapshot (org.apache.flink.runtime.checkpoint.TaskStateSnapshot): 3
ExecutionAttemptID (org.apache.flink.runtime.executiongraph.ExecutionAttemptID): 3
JobResult (org.apache.flink.runtime.jobmaster.JobResult): 3