Example 96 with RemoteException

use of org.apache.hadoop.ipc.RemoteException in project hadoop by apache.

the class RPCUtil method unwrapAndThrowException.

/**
   * Utility method that unwraps a ServiceException and throws the appropriate
   * underlying exception. Declared to return {@link Void} so callers can write
   * {@code return unwrapAndThrowException(se);} to satisfy the compiler, but
   * the method never returns normally.
   * 
   * @param se
   *          the ServiceException to unwrap
   * @return never returns; always throws a subclass of
   *         {@link YarnException} or {@link IOException}
   */
public static Void unwrapAndThrowException(ServiceException se) throws IOException, YarnException {
    Throwable cause = se.getCause();
    if (cause == null) {
        // SE generated by the RPC layer itself.
        throw new IOException(se);
    } else {
        if (cause instanceof RemoteException) {
            RemoteException re = (RemoteException) cause;
            Class<?> realClass = null;
            try {
                realClass = Class.forName(re.getClassName());
            } catch (ClassNotFoundException cnf) {
                // The remote exception class is not on the local classpath;
                // fall back to a generic YarnException carrying the remote info.
                throw instantiateYarnException(YarnException.class, re);
            }
            if (YarnException.class.isAssignableFrom(realClass)) {
                throw instantiateYarnException(realClass.asSubclass(YarnException.class), re);
            } else if (IOException.class.isAssignableFrom(realClass)) {
                throw instantiateIOException(realClass.asSubclass(IOException.class), re);
            } else if (RuntimeException.class.isAssignableFrom(realClass)) {
                throw instantiateRuntimeException(realClass.asSubclass(RuntimeException.class), re);
            } else {
                // Throw the RemoteException itself: it carries more useful
                // information than the java.lang.reflect exceptions would.
                throw re;
            }
        } else if (cause instanceof IOException) {
            // RPC Client exception.
            throw (IOException) cause;
        } else if (cause instanceof RuntimeException) {
            // RPC RuntimeException
            throw (RuntimeException) cause;
        } else {
            // Should not be generated.
            throw new IOException(se);
        }
    }
}
Also used : IOException(java.io.IOException) RemoteException(org.apache.hadoop.ipc.RemoteException) YarnException(org.apache.hadoop.yarn.exceptions.YarnException)
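The instantiateYarnException / instantiateIOException / instantiateRuntimeException helpers are not shown above. A minimal, hypothetical sketch of the reflection pattern such helpers rely on (the helper name and the use of java.io.IOException as the "remote" class are illustrative, not the Hadoop implementation):

```java
import java.lang.reflect.Constructor;

public class RemoteExceptionInstantiation {

    // Hypothetical helper mirroring the instantiate* methods in RPCUtil:
    // look up the (String) constructor of the resolved exception class and
    // rebuild the exception locally with the remote message.
    static <T extends Exception> T instantiateException(Class<? extends T> cls, String message)
            throws Exception {
        Constructor<? extends T> ctor = cls.getConstructor(String.class);
        return ctor.newInstance(message);
    }

    public static void main(String[] args) throws Exception {
        // java.io.IOException stands in for whatever class name the
        // RemoteException reported from the server side.
        Class<?> realClass = Class.forName("java.io.IOException");
        Exception e = instantiateException(realClass.asSubclass(Exception.class), "disk full");
        System.out.println(e.getClass().getName() + ": " + e.getMessage());
        // prints: java.io.IOException: disk full
    }
}
```

This is why unwrapAndThrowException falls back when Class.forName fails: without the class locally, there is no constructor to reflect on.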

Example 97 with RemoteException

use of org.apache.hadoop.ipc.RemoteException in project hadoop by apache.

the class TestRPCUtil method verifyRemoteExceptionUnwrapping.

private void verifyRemoteExceptionUnwrapping(Class<? extends Throwable> expectedLocalException, String realExceptionClassName) {
    String message = realExceptionClassName + "Message";
    RemoteException re = new RemoteException(realExceptionClassName, message);
    ServiceException se = new ServiceException(re);
    Throwable t = null;
    try {
        RPCUtil.unwrapAndThrowException(se);
    } catch (Throwable thrown) {
        t = thrown;
    }
    Assert.assertTrue("Expected exception [" + expectedLocalException + "] but found " + t, expectedLocalException.isInstance(t));
    Assert.assertTrue("Expected message [" + message + "] but found " + t.getMessage(), t.getMessage().contains(message));
}
Also used : ServiceException(com.google.protobuf.ServiceException) RemoteException(org.apache.hadoop.ipc.RemoteException)
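The test above uses a capture-and-assert pattern: invoke code that is expected to throw, hold on to whatever Throwable escapes, then check its type and message separately. A self-contained sketch with plain JDK exceptions and no Hadoop dependencies (names are illustrative):

```java
public class ThrowableCapture {

    // Run code that is expected to throw and return the captured Throwable,
    // or null if nothing was thrown.
    static Throwable capture(Runnable r) {
        try {
            r.run();
            return null;
        } catch (Throwable t) {
            return t;
        }
    }

    public static void main(String[] args) {
        Throwable t = capture(() -> {
            throw new IllegalStateException("RealExceptionMessage");
        });
        System.out.println(t instanceof IllegalStateException); // true
        System.out.println(t.getMessage().contains("Message"));  // true
    }
}
```

Capturing the Throwable rather than asserting inside the catch block keeps the type check and the message check independent, which is exactly what verifyRemoteExceptionUnwrapping does.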

Example 98 with RemoteException

use of org.apache.hadoop.ipc.RemoteException in project SSM by Intel-bigdata.

the class SmartServer method checkAndMarkRunning.

private OutputStream checkAndMarkRunning() throws IOException {
    try {
        if (fs.exists(SSM_ID_PATH)) {
            // Try appending to it so that this will fail fast if another SSM
            // instance is running.
            IOUtils.closeStream(fs.append(SSM_ID_PATH));
            fs.delete(SSM_ID_PATH, true);
        }
        final FSDataOutputStream fsout = fs.create(SSM_ID_PATH, false);
        // mark the SSM idPath to be deleted when the filesystem is closed
        fs.deleteOnExit(SSM_ID_PATH);
        fsout.writeBytes(InetAddress.getLocalHost().getHostName());
        fsout.hflush();
        return fsout;
    } catch (RemoteException e) {
        if (AlreadyBeingCreatedException.class.getName().equals(e.getClassName())) {
            return null;
        } else {
            throw e;
        }
    }
}
Also used : FSDataOutputStream(org.apache.hadoop.fs.FSDataOutputStream) RemoteException(org.apache.hadoop.ipc.RemoteException)
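The same fail-fast single-instance idea can be sketched against the local filesystem with java.nio (paths and class names here are illustrative, not the SSM code). CREATE_NEW plays the role of fs.create(SSM_ID_PATH, false): it throws FileAlreadyExistsException when another instance already created the id file:

```java
import java.io.IOException;
import java.nio.file.FileAlreadyExistsException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class SingleInstanceLock {

    // Try to create the id file exclusively; return false if another
    // instance already holds it.
    static boolean tryLock(Path idPath, String hostName) throws IOException {
        try {
            Files.write(idPath, hostName.getBytes(), StandardOpenOption.CREATE_NEW);
            idPath.toFile().deleteOnExit(); // analogous to fs.deleteOnExit
            return true;
        } catch (FileAlreadyExistsException e) {
            return false; // another instance is running
        }
    }

    public static void main(String[] args) throws IOException {
        Path id = Files.createTempDirectory("ssm-demo").resolve("ssm.id");
        System.out.println(tryLock(id, "host-a")); // true: lock acquired
        System.out.println(tryLock(id, "host-b")); // false: already held
    }
}
```

The HDFS version additionally probes an existing file with append() so it fails fast on the lease (AlreadyBeingCreatedException) while a previous holder is still alive; the local-filesystem sketch only captures the exclusive-create half of that.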

Example 99 with RemoteException

use of org.apache.hadoop.ipc.RemoteException in project phoenix by apache.

the class MetaDataUtil method tableRegionsOnline.

/**
     * Checks whether all regions of a table are online.
     * @param conf the cluster configuration
     * @param table the table whose regions are checked
     * @return true when all regions of the table are online, false otherwise
     */
public static boolean tableRegionsOnline(Configuration conf, PTable table) {
    HConnection hcon = null;
    try {
        hcon = HConnectionManager.getConnection(conf);
        List<HRegionLocation> locations = hcon.locateRegions(org.apache.hadoop.hbase.TableName.valueOf(table.getTableName().getBytes()));
        for (HRegionLocation loc : locations) {
            try {
                ServerName sn = loc.getServerName();
                if (sn == null)
                    continue;
                AdminService.BlockingInterface admin = hcon.getAdmin(sn);
                GetRegionInfoRequest request = RequestConverter.buildGetRegionInfoRequest(loc.getRegionInfo().getRegionName());
                admin.getRegionInfo(null, request);
            } catch (ServiceException e) {
                IOException ie = ProtobufUtil.getRemoteException(e);
                logger.debug("Region " + loc.getRegionInfo().getEncodedName() + " isn't online due to:" + ie);
                return false;
            } catch (RemoteException e) {
                logger.debug("Cannot get region " + loc.getRegionInfo().getEncodedName() + " info due to error:" + e);
                return false;
            }
        }
    } catch (IOException ex) {
        logger.warn("tableRegionsOnline failed due to:" + ex);
        return false;
    } finally {
        if (hcon != null) {
            try {
                hcon.close();
            } catch (IOException ignored) {
            }
        }
    }
    return true;
}
Also used : HRegionLocation(org.apache.hadoop.hbase.HRegionLocation) AdminService(org.apache.hadoop.hbase.protobuf.generated.AdminProtos.AdminService) GetRegionInfoRequest(org.apache.hadoop.hbase.protobuf.generated.AdminProtos.GetRegionInfoRequest) ServiceException(com.google.protobuf.ServiceException) ServerName(org.apache.hadoop.hbase.ServerName) IOException(java.io.IOException) RemoteException(org.apache.hadoop.ipc.RemoteException) HConnection(org.apache.hadoop.hbase.client.HConnection)
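In the first catch block above, ProtobufUtil.getRemoteException converts the protobuf-layer ServiceException back into an IOException before logging. A minimal stand-in sketch of that unwrapping behavior (the ServiceException class here is a local stub, not the protobuf one, and the method body is an assumption about the behavior rather than the HBase source):

```java
import java.io.IOException;

public class ServiceExceptionUnwrap {

    // Local stub standing in for com.google.protobuf.ServiceException;
    // only the getCause() behavior matters for this sketch.
    static class ServiceException extends Exception {
        ServiceException(Throwable cause) {
            super(cause);
        }
    }

    // Pass an IOException cause straight through, and wrap anything else
    // so callers deal with a single exception type.
    static IOException getRemoteException(ServiceException se) {
        Throwable cause = se.getCause();
        return (cause instanceof IOException)
                ? (IOException) cause
                : new IOException(cause);
    }

    public static void main(String[] args) {
        ServiceException se = new ServiceException(new IOException("region offline"));
        System.out.println(getRemoteException(se).getMessage()); // region offline
    }
}
```

Collapsing everything to IOException lets tableRegionsOnline treat any RPC failure uniformly: log the region name and return false.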

Aggregations

RemoteException (org.apache.hadoop.ipc.RemoteException): 99
IOException (java.io.IOException): 53
Test (org.junit.Test): 39
Path (org.apache.hadoop.fs.Path): 36
Configuration (org.apache.hadoop.conf.Configuration): 20
FileNotFoundException (java.io.FileNotFoundException): 19
FSDataOutputStream (org.apache.hadoop.fs.FSDataOutputStream): 13
FileSystem (org.apache.hadoop.fs.FileSystem): 12
InterruptedIOException (java.io.InterruptedIOException): 10
AccessControlException (org.apache.hadoop.security.AccessControlException): 10
ServerName (org.apache.hadoop.hbase.ServerName): 9
DistributedFileSystem (org.apache.hadoop.hdfs.DistributedFileSystem): 8
HdfsConfiguration (org.apache.hadoop.hdfs.HdfsConfiguration): 8
FileAlreadyExistsException (org.apache.hadoop.fs.FileAlreadyExistsException): 7
HRegionInfo (org.apache.hadoop.hbase.HRegionInfo): 7
MiniDFSCluster (org.apache.hadoop.hdfs.MiniDFSCluster): 7
EOFException (java.io.EOFException): 6
ArrayList (java.util.ArrayList): 6
DoNotRetryIOException (org.apache.hadoop.hbase.DoNotRetryIOException): 6
HBaseIOException (org.apache.hadoop.hbase.HBaseIOException): 6