Example 1 with HBaseSerde

Use of org.apache.flink.connector.hbase.util.HBaseSerde in project flink by apache.

The open method of the class HBaseRowDataAsyncLookupFunction:

@Override
public void open(FunctionContext context) {
    LOG.info("start open ...");
    final ExecutorService threadPool =
            Executors.newFixedThreadPool(
                    THREAD_POOL_SIZE,
                    new ExecutorThreadFactory(
                            "hbase-async-lookup-worker", Threads.LOGGING_EXCEPTION_HANDLER));
    Configuration config = prepareRuntimeConfiguration();
    CompletableFuture<AsyncConnection> asyncConnectionFuture = ConnectionFactory.createAsyncConnection(config);
    try {
        asyncConnection = asyncConnectionFuture.get();
        table = asyncConnection.getTable(TableName.valueOf(hTableName), threadPool);
        this.cache =
                cacheMaxSize <= 0 || cacheExpireMs <= 0
                        ? null
                        : CacheBuilder.newBuilder()
                                .recordStats()
                                .expireAfterWrite(cacheExpireMs, TimeUnit.MILLISECONDS)
                                .maximumSize(cacheMaxSize)
                                .build();
        if (cache != null && context != null) {
            context.getMetricGroup()
                    .gauge(
                            "lookupCacheHitRate",
                            (Gauge<Double>) () -> cache.stats().hitRate());
        }
    } catch (InterruptedException | ExecutionException e) {
        LOG.error("Exception while creating connection to HBase.", e);
        throw new RuntimeException("Cannot create connection to HBase.", e);
    }
    this.serde = new HBaseSerde(hbaseTableSchema, nullStringLiteral);
    LOG.info("end open.");
}
Also used:
ExecutorThreadFactory (org.apache.flink.util.concurrent.ExecutorThreadFactory)
Configuration (org.apache.hadoop.conf.Configuration)
ExecutorService (java.util.concurrent.ExecutorService)
AsyncConnection (org.apache.hadoop.hbase.client.AsyncConnection)
ExecutionException (java.util.concurrent.ExecutionException)
HBaseSerde (org.apache.flink.connector.hbase.util.HBaseSerde)
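The async variant above hands Executors.newFixedThreadPool a custom ThreadFactory (Flink's ExecutorThreadFactory) so the lookup worker threads carry a recognizable name. A minimal stdlib-only sketch of the same pattern; NamedThreadFactory and firstWorkerName are hypothetical names, not part of Flink:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadFactory;
import java.util.concurrent.atomic.AtomicInteger;

public class NamedThreadFactoryDemo {

    // Hypothetical stand-in for Flink's ExecutorThreadFactory: gives every
    // pool thread a common prefix so the lookup workers are easy to spot
    // in thread dumps and logs.
    static class NamedThreadFactory implements ThreadFactory {
        private final String prefix;
        private final AtomicInteger counter = new AtomicInteger();

        NamedThreadFactory(String prefix) {
            this.prefix = prefix;
        }

        @Override
        public Thread newThread(Runnable runnable) {
            Thread t = new Thread(runnable, prefix + "-" + counter.incrementAndGet());
            t.setDaemon(true); // do not let lookup workers keep the JVM alive
            return t;
        }
    }

    // Submits one task and returns the name of the thread that ran it.
    static String firstWorkerName() throws Exception {
        ExecutorService pool =
                Executors.newFixedThreadPool(
                        2, new NamedThreadFactory("hbase-async-lookup-worker"));
        try {
            return pool.submit(() -> Thread.currentThread().getName()).get();
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(firstWorkerName());
    }
}
```

Passing the pool to asyncConnection.getTable, as the snippet above does, makes these named threads the ones that execute the async HBase callbacks.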

Example 2 with HBaseSerde

Use of org.apache.flink.connector.hbase.util.HBaseSerde in project flink by apache.

The open method of the class HBaseRowDataLookupFunction:

@Override
public void open(FunctionContext context) {
    LOG.info("start open ...");
    Configuration config = prepareRuntimeConfiguration();
    try {
        hConnection = ConnectionFactory.createConnection(config);
        table = (HTable) hConnection.getTable(TableName.valueOf(hTableName));
        this.cache =
                cacheMaxSize <= 0 || cacheExpireMs <= 0
                        ? null
                        : CacheBuilder.newBuilder()
                                .recordStats()
                                .expireAfterWrite(cacheExpireMs, TimeUnit.MILLISECONDS)
                                .maximumSize(cacheMaxSize)
                                .build();
        if (cache != null) {
            context.getMetricGroup()
                    .gauge(
                            "lookupCacheHitRate",
                            (Gauge<Double>) () -> cache.stats().hitRate());
        }
    } catch (TableNotFoundException tnfe) {
        LOG.error("Table '{}' not found.", hTableName, tnfe);
        throw new RuntimeException("HBase table '" + hTableName + "' not found.", tnfe);
    } catch (IOException ioe) {
        LOG.error("Exception while creating connection to HBase.", ioe);
        throw new RuntimeException("Cannot create connection to HBase.", ioe);
    }
    this.serde = new HBaseSerde(hbaseTableSchema, nullStringLiteral);
    LOG.info("end open.");
}
Also used:
TableNotFoundException (org.apache.hadoop.hbase.TableNotFoundException)
Configuration (org.apache.hadoop.conf.Configuration)
IOException (java.io.IOException)
HBaseSerde (org.apache.flink.connector.hbase.util.HBaseSerde)
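Both open() methods disable caching entirely when cacheMaxSize or cacheExpireMs is non-positive, and otherwise build a Guava cache whose entries expire a fixed interval after being written. A minimal stdlib-only sketch of that contract; ExpiringCache and createOrNull are hypothetical names, and the real code uses com.google.common.cache.CacheBuilder:

```java
import java.util.HashMap;
import java.util.Map;

public class LookupCacheSketch {

    // Hypothetical stand-in for the Guava cache built in open(): entries
    // expire a fixed interval after they were written, and the factory
    // returns null (caching disabled) for non-positive settings, mirroring
    // the `cacheMaxSize <= 0 || cacheExpireMs <= 0 ? null : ...` ternary.
    static final class ExpiringCache<K, V> {
        private final long maxSize;
        private final long expireMs;
        private final Map<K, V> values = new HashMap<>();
        private final Map<K, Long> writeTime = new HashMap<>();

        private ExpiringCache(long maxSize, long expireMs) {
            this.maxSize = maxSize;
            this.expireMs = expireMs;
        }

        static <K, V> ExpiringCache<K, V> createOrNull(long maxSize, long expireMs) {
            return (maxSize <= 0 || expireMs <= 0)
                    ? null
                    : new ExpiringCache<>(maxSize, expireMs);
        }

        void put(K key, V value, long nowMs) {
            if (values.size() >= maxSize && !values.containsKey(key)) {
                return; // crude size bound; Guava evicts old entries instead
            }
            values.put(key, value);
            writeTime.put(key, nowMs);
        }

        V getIfPresent(K key, long nowMs) {
            Long written = writeTime.get(key);
            if (written == null || nowMs - written >= expireMs) {
                return null; // absent, or expired after write
            }
            return values.get(key);
        }
    }

    public static void main(String[] args) {
        // Non-positive settings disable caching, as in the lookup functions.
        System.out.println(ExpiringCache.createOrNull(0, 1000) == null);

        ExpiringCache<String, String> cache = ExpiringCache.createOrNull(10, 1000);
        cache.put("rowkey", "value", 0);
        System.out.println(cache.getIfPresent("rowkey", 500));  // fresh entry
        System.out.println(cache.getIfPresent("rowkey", 2000)); // expired by now
    }
}
```

In the lookup functions a null cache means every eval goes straight to HBase, which is why the hit-rate gauge is only registered when the cache exists.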

Aggregations

HBaseSerde (org.apache.flink.connector.hbase.util.HBaseSerde): 2 uses
Configuration (org.apache.hadoop.conf.Configuration): 2 uses
IOException (java.io.IOException): 1 use
ExecutionException (java.util.concurrent.ExecutionException): 1 use
ExecutorService (java.util.concurrent.ExecutorService): 1 use
ExecutorThreadFactory (org.apache.flink.util.concurrent.ExecutorThreadFactory): 1 use
TableNotFoundException (org.apache.hadoop.hbase.TableNotFoundException): 1 use
AsyncConnection (org.apache.hadoop.hbase.client.AsyncConnection): 1 use