
Example 21 with TableNotFoundException

Use of org.apache.hadoop.hbase.TableNotFoundException in project incubator-atlas by apache.

From the class HBaseStoreManager, method ensureColumnFamilyExists:

private void ensureColumnFamilyExists(String tableName, String columnFamily, int ttlInSeconds) throws BackendException {
    AdminMask adm = null;
    try {
        adm = getAdminInterface();
        HTableDescriptor desc = ensureTableExists(tableName, columnFamily, ttlInSeconds);
        Preconditions.checkNotNull(desc);
        HColumnDescriptor cf = desc.getFamily(columnFamily.getBytes());
        // Create our column family, if necessary
        if (cf == null) {
            try {
                if (!adm.isTableDisabled(tableName)) {
                    adm.disableTable(tableName);
                }
            } catch (TableNotEnabledException e) {
                logger.debug("Table {} already disabled", tableName);
            } catch (IOException e) {
                throw new TemporaryBackendException(e);
            }
            try {
                HColumnDescriptor cdesc = new HColumnDescriptor(columnFamily);
                setCFOptions(cdesc, ttlInSeconds);
                adm.addColumn(tableName, cdesc);
                logger.debug("Added HBase ColumnFamily {}, waiting for 1 sec. to propogate.", columnFamily);
                adm.enableTable(tableName);
            } catch (TableNotFoundException ee) {
                logger.error("TableNotFoundException", ee);
                throw new PermanentBackendException(ee);
            } catch (org.apache.hadoop.hbase.TableExistsException ee) {
                logger.debug("Swallowing exception {}", ee);
            } catch (IOException ee) {
                throw new TemporaryBackendException(ee);
            }
        }
    } finally {
        IOUtils.closeQuietly(adm);
    }
}
Also used : TableNotFoundException(org.apache.hadoop.hbase.TableNotFoundException) TemporaryBackendException(com.thinkaurelius.titan.diskstorage.TemporaryBackendException) HColumnDescriptor(org.apache.hadoop.hbase.HColumnDescriptor) PermanentBackendException(com.thinkaurelius.titan.diskstorage.PermanentBackendException) IOException(java.io.IOException) HTableDescriptor(org.apache.hadoop.hbase.HTableDescriptor) TableNotEnabledException(org.apache.hadoop.hbase.TableNotEnabledException)
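For context, here is a minimal sketch of the same disable / add-column-family / enable cycle written directly against the HBase 1.x Admin API. It is not part of the indexed source; the class name, table handling, and TTL wiring are illustrative assumptions.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class EnsureColumnFamilySketch {

    public static void ensureColumnFamily(String table, String family, int ttlInSeconds) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Admin admin = connection.getAdmin()) {
            TableName tableName = TableName.valueOf(table);
            // Throws TableNotFoundException if the table has not been created yet.
            HTableDescriptor desc = admin.getTableDescriptor(tableName);
            if (desc.getFamily(family.getBytes()) != null) {
                // The column family already exists; nothing to do.
                return;
            }
            // Mirror the example above: disable before altering, then re-enable.
            if (!admin.isTableDisabled(tableName)) {
                admin.disableTable(tableName);
            }
            HColumnDescriptor cf = new HColumnDescriptor(family);
            cf.setTimeToLive(ttlInSeconds);
            admin.addColumn(tableName, cf);
            admin.enableTable(tableName);
        }
    }
}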

Example 22 with TableNotFoundException

Use of org.apache.hadoop.hbase.TableNotFoundException in project cdap by caskdata.

From the class HBaseTableFactory, method disableTable:

private void disableTable(HBaseDDLExecutor ddlExecutor, TableId tableId) throws IOException {
    try {
        TableName tableName = HTableNameConverter.toTableName(cConf.get(Constants.Dataset.TABLE_PREFIX), tableId);
        ddlExecutor.disableTableIfEnabled(tableName.getNamespaceAsString(), tableName.getQualifierAsString());
        LOG.debug("TMS Table {} has been disabled.", tableId);
    } catch (TableNotFoundException ex) {
        LOG.debug("TMS Table {} was not found. Skipping disable.", tableId, ex);
    } catch (TableNotEnabledException ex) {
        LOG.debug("TMS Table {} was already in disabled state.", tableId, ex);
    }
}
Also used : TableName(org.apache.hadoop.hbase.TableName) TableNotFoundException(org.apache.hadoop.hbase.TableNotFoundException) TableNotEnabledException(org.apache.hadoop.hbase.TableNotEnabledException)
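The same idempotent behaviour can be sketched against the plain Admin API, without CDAP's HBaseDDLExecutor; the helper below is an illustration, not the project's code.

import java.io.IOException;

import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.TableNotEnabledException;
import org.apache.hadoop.hbase.TableNotFoundException;
import org.apache.hadoop.hbase.client.Admin;

public final class DisableIfEnabledSketch {

    // Disables the table if it exists and is enabled; otherwise does nothing.
    public static void disableIfEnabled(Admin admin, TableName tableName) throws IOException {
        try {
            admin.disableTable(tableName);
        } catch (TableNotFoundException e) {
            // The table was never created or has already been dropped: nothing to disable.
        } catch (TableNotEnabledException e) {
            // The table is already disabled, which is the desired end state.
        }
    }
}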

Example 23 with TableNotFoundException

Use of org.apache.hadoop.hbase.TableNotFoundException in project hbase by apache.

From the class ResourceBase, method processException:

protected Response processException(Throwable exp) {
    Throwable curr = exp;
    if (accessDeniedClazz != null) {
        // Some access-denied exceptions are buried deeper in the cause chain.
        while (curr != null) {
            if (accessDeniedClazz.isAssignableFrom(curr.getClass())) {
                throw new WebApplicationException(Response.status(Response.Status.FORBIDDEN).type(MIMETYPE_TEXT).entity("Forbidden" + CRLF + StringUtils.stringifyException(exp) + CRLF).build());
            }
            curr = curr.getCause();
        }
    }
    // TableNotFoundException may also be buried one level deep.
    if (exp instanceof TableNotFoundException || exp.getCause() instanceof TableNotFoundException) {
        throw new WebApplicationException(Response.status(Response.Status.NOT_FOUND).type(MIMETYPE_TEXT).entity("Not found" + CRLF + StringUtils.stringifyException(exp) + CRLF).build());
    }
    if (exp instanceof NoSuchColumnFamilyException) {
        throw new WebApplicationException(Response.status(Response.Status.NOT_FOUND).type(MIMETYPE_TEXT).entity("Not found" + CRLF + StringUtils.stringifyException(exp) + CRLF).build());
    }
    if (exp instanceof RuntimeException) {
        throw new WebApplicationException(Response.status(Response.Status.BAD_REQUEST).type(MIMETYPE_TEXT).entity("Bad request" + CRLF + StringUtils.stringifyException(exp) + CRLF).build());
    }
    if (exp instanceof RetriesExhaustedWithDetailsException) {
        RetriesExhaustedWithDetailsException retryException = (RetriesExhaustedWithDetailsException) exp;
        processException(retryException.getCause(0));
    }
    throw new WebApplicationException(Response.status(Response.Status.SERVICE_UNAVAILABLE).type(MIMETYPE_TEXT).entity("Unavailable" + CRLF + StringUtils.stringifyException(exp) + CRLF).build());
}
Also used : TableNotFoundException(org.apache.hadoop.hbase.TableNotFoundException) RetriesExhaustedWithDetailsException(org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException) WebApplicationException(javax.ws.rs.WebApplicationException) NoSuchColumnFamilyException(org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException)
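A typical call site looks like the sketch below: a resource extending ResourceBase funnels any failure into processException, which converts it into a WebApplicationException carrying the matching HTTP status (404 for TableNotFoundException and NoSuchColumnFamilyException, 403 for access denied, 400 for RuntimeException, 503 otherwise). The resource class and its body are illustrative, not part of the HBase sources.

import java.io.IOException;

import javax.ws.rs.GET;
import javax.ws.rs.Produces;
import javax.ws.rs.core.Response;

import org.apache.hadoop.hbase.rest.ResourceBase;

public class ExampleResource extends ResourceBase {

    public ExampleResource() throws IOException {
        super();
    }

    @GET
    @Produces(MIMETYPE_TEXT)
    public Response get() {
        try {
            // ... issue the actual Get/Scan against HBase here ...
            return Response.ok().build();
        } catch (IOException e) {
            // processException always throws, so nothing after this line is reached.
            return processException(e);
        }
    }
}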

Example 24 with TableNotFoundException

Use of org.apache.hadoop.hbase.TableNotFoundException in project hbase by apache.

From the class TestScannerWithBulkload, method testBulkLoadWithParallelScan:

@Test
public void testBulkLoadWithParallelScan() throws Exception {
    final TableName tableName = TableName.valueOf(name.getMethodName());
    final long l = System.currentTimeMillis();
    final Admin admin = TEST_UTIL.getAdmin();
    createTable(admin, tableName);
    Scan scan = createScan();
    scan.setCaching(1);
    final Table table = init(admin, l, scan, tableName);
    // use bulkload
    final Path hfilePath = writeToHFile(l, "/temp/testBulkLoadWithParallelScan/", "/temp/testBulkLoadWithParallelScan/col/file", false);
    Configuration conf = TEST_UTIL.getConfiguration();
    conf.setBoolean("hbase.mapreduce.bulkload.assign.sequenceNumbers", true);
    final LoadIncrementalHFiles bulkload = new LoadIncrementalHFiles(conf);
    ResultScanner scanner = table.getScanner(scan);
    Result result = scanner.next();
    // Create a scanner and then do bulk load
    final CountDownLatch latch = new CountDownLatch(1);
    new Thread() {

        public void run() {
            try {
                Put put1 = new Put(Bytes.toBytes("row5"));
                put1.add(new KeyValue(Bytes.toBytes("row5"), Bytes.toBytes("col"), Bytes.toBytes("q"), l, Bytes.toBytes("version0")));
                table.put(put1);
                try (RegionLocator locator = TEST_UTIL.getConnection().getRegionLocator(tableName)) {
                    bulkload.doBulkLoad(hfilePath, admin, table, locator);
                }
                latch.countDown();
            } catch (TableNotFoundException e) {
                // Swallowed by the test thread; not expected, since the table was created above.
            } catch (IOException e) {
                // Swallowed by the test thread; note that on failure latch.countDown() is never
                // reached, so the main thread would block in latch.await().
            }
        }
    }.start();
    latch.await();
    // By the time we do next() the bulk loaded files are also added to the kv
    // scanner
    scanAfterBulkLoad(scanner, result, "version1");
    scanner.close();
    table.close();
}
Also used : Path(org.apache.hadoop.fs.Path) RegionLocator(org.apache.hadoop.hbase.client.RegionLocator) Table(org.apache.hadoop.hbase.client.Table) ResultScanner(org.apache.hadoop.hbase.client.ResultScanner) KeyValue(org.apache.hadoop.hbase.KeyValue) Configuration(org.apache.hadoop.conf.Configuration) IOException(java.io.IOException) Admin(org.apache.hadoop.hbase.client.Admin) CountDownLatch(java.util.concurrent.CountDownLatch) Put(org.apache.hadoop.hbase.client.Put) Result(org.apache.hadoop.hbase.client.Result) TableName(org.apache.hadoop.hbase.TableName) TableNotFoundException(org.apache.hadoop.hbase.TableNotFoundException) LoadIncrementalHFiles(org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles) Scan(org.apache.hadoop.hbase.client.Scan) Test(org.junit.Test)
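Stripped of the parallel-scan scaffolding, the bulk-load call itself looks like the sketch below; doBulkLoad is the point at which TableNotFoundException can surface if the target table is missing. The table name, HFile directory, and class name are illustrative.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.TableNotFoundException;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.RegionLocator;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;

public class BulkLoadSketch {

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        conf.setBoolean("hbase.mapreduce.bulkload.assign.sequenceNumbers", true);
        TableName tableName = TableName.valueOf("exampleTable");
        Path hfileDir = new Path("/temp/exampleBulkLoad");
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Admin admin = connection.getAdmin();
             Table table = connection.getTable(tableName);
             RegionLocator locator = connection.getRegionLocator(tableName)) {
            // Throws TableNotFoundException if the target table does not exist.
            new LoadIncrementalHFiles(conf).doBulkLoad(hfileDir, admin, table, locator);
        } catch (TableNotFoundException e) {
            System.err.println("Target table is missing; create it before bulk loading: " + e);
        }
    }
}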

Example 25 with TableNotFoundException

Use of org.apache.hadoop.hbase.TableNotFoundException in project hbase by apache.

From the class MasterProcedureScheduler, method completionCleanup:

@Override
public void completionCleanup(final Procedure proc) {
    if (proc instanceof TableProcedureInterface) {
        TableProcedureInterface iProcTable = (TableProcedureInterface) proc;
        boolean tableDeleted;
        if (proc.hasException()) {
            Exception procEx = proc.getException().unwrapRemoteException();
            if (iProcTable.getTableOperationType() == TableOperationType.CREATE) {
                // the create failed because the table already exists
                tableDeleted = !(procEx instanceof TableExistsException);
            } else {
                // the operation failed because the table does not exist
                tableDeleted = (procEx instanceof TableNotFoundException);
            }
        } else {
            // the table was deleted
            tableDeleted = (iProcTable.getTableOperationType() == TableOperationType.DELETE);
        }
        if (tableDeleted) {
            markTableAsDeleted(iProcTable.getTableName(), proc);
            return;
        }
    } else {
        // No cleanup for ServerProcedureInterface types, yet.
        return;
    }
}
Also used : TableNotFoundException(org.apache.hadoop.hbase.TableNotFoundException) TableExistsException(org.apache.hadoop.hbase.TableExistsException)
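The scheduler's decision hinges on unwrapping the remote failure before the instanceof checks. The procedure framework carries its own RemoteProcedureException wrapper; the general pattern can be sketched with Hadoop's generic RemoteException, so the helper below is an illustration rather than the scheduler's exact code path.

import java.io.IOException;

import org.apache.hadoop.hbase.TableExistsException;
import org.apache.hadoop.hbase.TableNotFoundException;
import org.apache.hadoop.ipc.RemoteException;

public final class UnwrapSketch {

    // Unwraps a server-side failure so instanceof checks see the real exception type.
    private static IOException unwrap(IOException failure) {
        return failure instanceof RemoteException
                ? ((RemoteException) failure).unwrapRemoteException()
                : failure;
    }

    // True if the failure means the table does not exist (e.g. an operation racing a drop).
    public static boolean indicatesMissingTable(IOException failure) {
        return unwrap(failure) instanceof TableNotFoundException;
    }

    // True if a CREATE failed only because the table is already there.
    public static boolean tableAlreadyExists(IOException failure) {
        return unwrap(failure) instanceof TableExistsException;
    }
}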

Aggregations

TableNotFoundException (org.apache.hadoop.hbase.TableNotFoundException): 41 uses
IOException (java.io.IOException): 19 uses
TableName (org.apache.hadoop.hbase.TableName): 14 uses
TableNotEnabledException (org.apache.hadoop.hbase.TableNotEnabledException): 8 uses
Test (org.junit.Test): 8 uses
HRegionInfo (org.apache.hadoop.hbase.HRegionInfo): 7 uses
HTableDescriptor (org.apache.hadoop.hbase.HTableDescriptor): 7 uses
ServerName (org.apache.hadoop.hbase.ServerName): 6 uses
ArrayList (java.util.ArrayList): 5 uses
HColumnDescriptor (org.apache.hadoop.hbase.HColumnDescriptor): 5 uses
Connection (org.apache.hadoop.hbase.client.Connection): 5 uses
Table (org.apache.hadoop.hbase.client.Table): 5 uses
Path (org.apache.hadoop.fs.Path): 4 uses
DoNotRetryIOException (org.apache.hadoop.hbase.DoNotRetryIOException): 4 uses
TableNotDisabledException (org.apache.hadoop.hbase.TableNotDisabledException): 4 uses
RegionLocator (org.apache.hadoop.hbase.client.RegionLocator): 4 uses
InterruptedIOException (java.io.InterruptedIOException): 3 uses
LinkedList (java.util.LinkedList): 3 uses
List (java.util.List): 3 uses
Map (java.util.Map): 3 uses