
Example 16 with RemoteException

Use of org.apache.hadoop.ipc.RemoteException in project hadoop by apache.

The class TestSnapshotNameWithInvalidCharacters, method TestSnapshotWithInvalidName.

@Test(timeout = 600000)
public void TestSnapshotWithInvalidName() throws Exception {
    Path file1 = new Path(dir1, file1Name);
    DFSTestUtil.createFile(hdfs, file1, BLOCKSIZE, REPLICATION, SEED);
    hdfs.allowSnapshot(dir1);
    try {
        // snapshot1 holds a name with invalid characters; the NameNode is
        // expected to reject it, and the failure reaches the client as a
        // RemoteException.
        hdfs.createSnapshot(dir1, snapshot1);
    } catch (RemoteException e) {
        // Expected: the invalid snapshot name was rejected server-side.
    }
}
Also used : Path(org.apache.hadoop.fs.Path) RemoteException(org.apache.hadoop.ipc.RemoteException) Test(org.junit.Test)

Example 17 with RemoteException

Use of org.apache.hadoop.ipc.RemoteException in project hadoop by apache.

The class TestSnapshotNameWithInvalidCharacters, method TestSnapshotWithInvalidName1.

@Test(timeout = 60000)
public void TestSnapshotWithInvalidName1() throws Exception {
    Path file1 = new Path(dir1, file1Name);
    DFSTestUtil.createFile(hdfs, file1, BLOCKSIZE, REPLICATION, SEED);
    hdfs.allowSnapshot(dir1);
    try {
        // snapshot2 is another invalid snapshot name; see the previous test.
        hdfs.createSnapshot(dir1, snapshot2);
    } catch (RemoteException e) {
        // Expected: rejected server-side.
    }
}
Also used : Path(org.apache.hadoop.fs.Path) RemoteException(org.apache.hadoop.ipc.RemoteException) Test(org.junit.Test)
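
Both tests exercise the same pattern: the NameNode rejects the invalid snapshot name, and the failure reaches the client wrapped in a RemoteException. A minimal, self-contained sketch of that pattern (illustrative only, not project code: the path is hypothetical, the cluster must be running, and the directory must already allow snapshots), which also inspects the server-side exception class:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.ipc.RemoteException;

public class InvalidSnapshotNameSketch {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        // Hypothetical directory; snapshots must already be allowed on it.
        Path dir = new Path("/tmp/snapdir");
        try {
            // A snapshot name containing a path separator is invalid.
            fs.createSnapshot(dir, "a/b");
        } catch (RemoteException re) {
            // getClassName() names the exception class raised on the NameNode.
            System.err.println("Rejected by server: " + re.getClassName());
        }
    }
}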

Example 18 with RemoteException

Use of org.apache.hadoop.ipc.RemoteException in project hadoop by apache.

The class DFSClient, method createSymlink.

/**
   * Creates a symbolic link.
   *
   * @see ClientProtocol#createSymlink(String, String, FsPermission, boolean)
   */
public void createSymlink(String target, String link, boolean createParent) throws IOException {
    checkOpen();
    try (TraceScope ignored = newPathTraceScope("createSymlink", target)) {
        final FsPermission dirPerm = applyUMask(null);
        namenode.createSymlink(target, link, dirPerm, createParent);
    } catch (RemoteException re) {
        // If the server-side exception class matches one of these lookup
        // types, re-throw it as the corresponding client-side exception;
        // otherwise re-throw the RemoteException itself.
        throw re.unwrapRemoteException(AccessControlException.class,
            FileAlreadyExistsException.class, FileNotFoundException.class,
            ParentNotDirectoryException.class, NSQuotaExceededException.class,
            DSQuotaExceededException.class,
            QuotaByStorageTypeExceededException.class,
            UnresolvedPathException.class,
            SnapshotAccessControlException.class);
    }
}
}
Also used : QuotaByStorageTypeExceededException(org.apache.hadoop.hdfs.protocol.QuotaByStorageTypeExceededException) FileAlreadyExistsException(org.apache.hadoop.fs.FileAlreadyExistsException) ParentNotDirectoryException(org.apache.hadoop.fs.ParentNotDirectoryException) DSQuotaExceededException(org.apache.hadoop.hdfs.protocol.DSQuotaExceededException) TraceScope(org.apache.htrace.core.TraceScope) FileNotFoundException(java.io.FileNotFoundException) AccessControlException(org.apache.hadoop.security.AccessControlException) SnapshotAccessControlException(org.apache.hadoop.hdfs.protocol.SnapshotAccessControlException) NSQuotaExceededException(org.apache.hadoop.hdfs.protocol.NSQuotaExceededException) FsPermission(org.apache.hadoop.fs.permission.FsPermission) RemoteException(org.apache.hadoop.ipc.RemoteException) UnresolvedPathException(org.apache.hadoop.hdfs.protocol.UnresolvedPathException)
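
Because createSymlink unwraps the RemoteException, callers of the public FileSystem API catch the concrete exception types directly. A caller-side sketch (illustrative only: the paths are hypothetical, and symlink support, which is disabled by default, is switched on with FileSystem.enableSymlinks()):

import java.io.FileNotFoundException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileAlreadyExistsException;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class SymlinkSketch {
    public static void main(String[] args) throws Exception {
        // Symlink support is off by default and must be enabled client-side.
        FileSystem.enableSymlinks();
        FileSystem fs = FileSystem.get(new Configuration());
        try {
            // Hypothetical paths; createParent=true creates the link's parents.
            fs.createSymlink(new Path("/data/current"), new Path("/links/latest"), true);
        } catch (FileAlreadyExistsException e) {
            // Typed exceptions arrive here because DFSClient already unwrapped
            // the RemoteException using the server-side class name.
            System.err.println("Link already exists: " + e.getMessage());
        } catch (FileNotFoundException e) {
            System.err.println("Missing parent: " + e.getMessage());
        }
    }
}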

Example 19 with RemoteException

Use of org.apache.hadoop.ipc.RemoteException in project hadoop by apache.

The class DFSClient, method primitiveMkdir.

/**
   * Same as {@link #mkdirs(String, FsPermission, boolean)} except
   * that the permissions have already been masked against umask.
   */
public boolean primitiveMkdir(String src, FsPermission absPermission, boolean createParent) throws IOException {
    checkOpen();
    if (absPermission == null) {
        absPermission = applyUMaskDir(null);
    }
    LOG.debug("{}: masked={}", src, absPermission);
    try (TraceScope ignored = tracer.newScope("mkdir")) {
        return namenode.mkdirs(src, absPermission, createParent);
    } catch (RemoteException re) {
        throw re.unwrapRemoteException(AccessControlException.class,
            InvalidPathException.class, FileAlreadyExistsException.class,
            FileNotFoundException.class, ParentNotDirectoryException.class,
            SafeModeException.class, NSQuotaExceededException.class,
            DSQuotaExceededException.class,
            QuotaByStorageTypeExceededException.class,
            UnresolvedPathException.class,
            SnapshotAccessControlException.class);
    }
}
}
Also used : QuotaByStorageTypeExceededException(org.apache.hadoop.hdfs.protocol.QuotaByStorageTypeExceededException) FileAlreadyExistsException(org.apache.hadoop.fs.FileAlreadyExistsException) TraceScope(org.apache.htrace.core.TraceScope) FileNotFoundException(java.io.FileNotFoundException) AccessControlException(org.apache.hadoop.security.AccessControlException) SnapshotAccessControlException(org.apache.hadoop.hdfs.protocol.SnapshotAccessControlException) InvalidPathException(org.apache.hadoop.fs.InvalidPathException) ParentNotDirectoryException(org.apache.hadoop.fs.ParentNotDirectoryException) DSQuotaExceededException(org.apache.hadoop.hdfs.protocol.DSQuotaExceededException) SafeModeException(org.apache.hadoop.hdfs.server.namenode.SafeModeException) NSQuotaExceededException(org.apache.hadoop.hdfs.protocol.NSQuotaExceededException) RemoteException(org.apache.hadoop.ipc.RemoteException) UnresolvedPathException(org.apache.hadoop.hdfs.protocol.UnresolvedPathException)
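
The unwrap idiom is the same in every DFSClient method: unwrapRemoteException compares the class name carried by the RemoteException against the listed lookup types and, on a match, returns an instance of that local type; otherwise it returns the RemoteException itself, so the rethrown IOException is always the most specific type available. A compact sketch of the idiom in isolation (the class and method below are illustrative, not part of DFSClient):

import java.io.IOException;
import org.apache.hadoop.fs.ParentNotDirectoryException;
import org.apache.hadoop.ipc.RemoteException;
import org.apache.hadoop.security.AccessControlException;

public class UnwrapSketch {
    // Illustrative helper: yields a typed exception when the remote class
    // matches one of the lookup types, or the RemoteException itself when not.
    static IOException toLocalException(RemoteException re) {
        return re.unwrapRemoteException(AccessControlException.class,
            ParentNotDirectoryException.class);
    }
}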

Example 20 with RemoteException

Use of org.apache.hadoop.ipc.RemoteException in project hadoop by apache.

The class DFSClient, method callAppend.

/** Gets the output stream returned by the append call. */
private DFSOutputStream callAppend(String src, EnumSet<CreateFlag> flag, Progressable progress, String[] favoredNodes) throws IOException {
    CreateFlag.validateForAppend(flag);
    try {
        final LastBlockWithStatus blkWithStatus = callAppend(src, new EnumSetWritable<>(flag, CreateFlag.class));
        HdfsFileStatus status = blkWithStatus.getFileStatus();
        if (status == null) {
            LOG.debug("NameNode is on an older version, request file " + "info with additional RPC call for file: {}", src);
            status = getFileInfo(src);
        }
        return DFSOutputStream.newStreamForAppend(this, src, flag, progress,
            blkWithStatus.getLastBlock(), status,
            dfsClientConf.createChecksum(null), favoredNodes);
    } catch (RemoteException re) {
        throw re.unwrapRemoteException(AccessControlException.class,
            FileNotFoundException.class, SafeModeException.class,
            DSQuotaExceededException.class,
            QuotaByStorageTypeExceededException.class,
            UnsupportedOperationException.class,
            UnresolvedPathException.class,
            SnapshotAccessControlException.class);
    }
}
}
Also used : CreateFlag(org.apache.hadoop.fs.CreateFlag) QuotaByStorageTypeExceededException(org.apache.hadoop.hdfs.protocol.QuotaByStorageTypeExceededException) LastBlockWithStatus(org.apache.hadoop.hdfs.protocol.LastBlockWithStatus) FileNotFoundException(java.io.FileNotFoundException) AccessControlException(org.apache.hadoop.security.AccessControlException) SnapshotAccessControlException(org.apache.hadoop.hdfs.protocol.SnapshotAccessControlException) HdfsFileStatus(org.apache.hadoop.hdfs.protocol.HdfsFileStatus) DSQuotaExceededException(org.apache.hadoop.hdfs.protocol.DSQuotaExceededException) SafeModeException(org.apache.hadoop.hdfs.server.namenode.SafeModeException) RemoteException(org.apache.hadoop.ipc.RemoteException) UnresolvedPathException(org.apache.hadoop.hdfs.protocol.UnresolvedPathException)
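
Seen from the public API, the unwrapping in callAppend means an append caller handles ordinary typed exceptions. A minimal sketch (illustrative only: the path is hypothetical, the file must already exist, and the underlying file system must support append):

import java.io.FileNotFoundException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class AppendSketch {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        try (FSDataOutputStream out = fs.append(new Path("/logs/app.log"))) {
            out.writeBytes("one more line\n");
        } catch (FileNotFoundException e) {
            // Unwrapped by callAppend: the file to append to does not exist.
            System.err.println("No such file: " + e.getMessage());
        }
    }
}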

Aggregations

RemoteException (org.apache.hadoop.ipc.RemoteException): 99
IOException (java.io.IOException): 53
Test (org.junit.Test): 39
Path (org.apache.hadoop.fs.Path): 36
Configuration (org.apache.hadoop.conf.Configuration): 20
FileNotFoundException (java.io.FileNotFoundException): 19
FSDataOutputStream (org.apache.hadoop.fs.FSDataOutputStream): 13
FileSystem (org.apache.hadoop.fs.FileSystem): 12
InterruptedIOException (java.io.InterruptedIOException): 10
AccessControlException (org.apache.hadoop.security.AccessControlException): 10
ServerName (org.apache.hadoop.hbase.ServerName): 9
DistributedFileSystem (org.apache.hadoop.hdfs.DistributedFileSystem): 8
HdfsConfiguration (org.apache.hadoop.hdfs.HdfsConfiguration): 8
FileAlreadyExistsException (org.apache.hadoop.fs.FileAlreadyExistsException): 7
HRegionInfo (org.apache.hadoop.hbase.HRegionInfo): 7
MiniDFSCluster (org.apache.hadoop.hdfs.MiniDFSCluster): 7
EOFException (java.io.EOFException): 6
ArrayList (java.util.ArrayList): 6
DoNotRetryIOException (org.apache.hadoop.hbase.DoNotRetryIOException): 6
HBaseIOException (org.apache.hadoop.hbase.HBaseIOException): 6