
Example 1 with ParentNotDirectoryException

Use of org.apache.hadoop.fs.ParentNotDirectoryException in the Apache Hadoop project.

From the class FSDirectory, method resolvePath:

/**
   * Resolves a given path into an INodesInPath.  All ancestor inodes that
   * exist are validated as traversable directories.  Symlinks in the ancestry
   * will generate an UnresolvedLinkException.  The returned IIP will be an
   * accessible path that also passed additional sanity checks based on how
   * the path will be used as specified by the DirOp.
   *   READ:   Expands reserved paths and performs permission checks
   *           during traversal.  Raw paths are only accessible by a superuser.
   *   WRITE:  In addition to READ checks, ensures the path is not a
   *           snapshot path.
   *   CREATE: In addition to WRITE checks, ensures path does not contain
   *           illegal character sequences.
   *
   * @param pc  A permission checker for traversal checks.  Pass null for
   *            no permission checks.
   * @param src The path to resolve.
   * @param dirOp The {@link DirOp} that controls additional checks.
   * @return the resolved IIP; if {@code src} refers to a path in the "raw"
   *         directory, the returned IIP is marked as raw.
   * @throws FileNotFoundException
   * @throws AccessControlException
   * @throws ParentNotDirectoryException
   * @throws UnresolvedLinkException
   */
@VisibleForTesting
public INodesInPath resolvePath(FSPermissionChecker pc, String src, DirOp dirOp) throws UnresolvedLinkException, FileNotFoundException, AccessControlException, ParentNotDirectoryException {
    boolean isCreate = (dirOp == DirOp.CREATE || dirOp == DirOp.CREATE_LINK);
    // prevent creation of new invalid paths
    if (isCreate && !DFSUtil.isValidName(src)) {
        throw new InvalidPathException("Invalid file name: " + src);
    }
    byte[][] components = INode.getPathComponents(src);
    boolean isRaw = isReservedRawName(components);
    if (isPermissionEnabled && pc != null && isRaw) {
        pc.checkSuperuserPrivilege();
    }
    components = resolveComponents(components, this);
    INodesInPath iip = INodesInPath.resolve(rootDir, components, isRaw);
    // For non-create operations, surface a ParentNotDirectoryException as an
    // AccessControlException; create operations see the original exception.
    try {
        checkTraverse(pc, iip, dirOp);
    } catch (ParentNotDirectoryException pnde) {
        if (!isCreate) {
            throw new AccessControlException(pnde.getMessage());
        }
        throw pnde;
    }
    return iip;
}
Also used : ParentNotDirectoryException(org.apache.hadoop.fs.ParentNotDirectoryException) AccessControlException(org.apache.hadoop.security.AccessControlException) SnapshotAccessControlException(org.apache.hadoop.hdfs.protocol.SnapshotAccessControlException) InvalidPathException(org.apache.hadoop.fs.InvalidPathException) VisibleForTesting(com.google.common.annotations.VisibleForTesting)
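The interesting control flow in resolvePath is the catch block: for non-create operations a ParentNotDirectoryException is rethrown as AccessControlException, while create operations get the original exception. That pattern can be sketched without any HDFS internals; the exception classes below are local stand-ins defined just for this sketch, not the real org.apache.hadoop.fs types:

```java
// Local stand-ins for the Hadoop exception types (assumption: these exist only
// to illustrate the control flow; the real classes live in org.apache.hadoop.fs
// and org.apache.hadoop.security).
class ParentNotDirectoryException extends java.io.IOException {
    ParentNotDirectoryException(String msg) { super(msg); }
}
class AccessControlException extends java.io.IOException {
    AccessControlException(String msg) { super(msg); }
}

public class ResolveSketch {
    // Mirrors the resolvePath() catch block: a ParentNotDirectoryException is
    // narrowed to AccessControlException unless the caller is creating a path.
    static void resolve(boolean isCreate, boolean ancestorIsFile) throws java.io.IOException {
        try {
            if (ancestorIsFile) {
                throw new ParentNotDirectoryException("ancestor is a file");
            }
        } catch (ParentNotDirectoryException pnde) {
            if (!isCreate) {
                throw new AccessControlException(pnde.getMessage());
            }
            throw pnde;
        }
    }

    // Helper that reports which exception type the caller observes.
    static String thrownName(boolean isCreate, boolean ancestorIsFile) {
        try {
            resolve(isCreate, ancestorIsFile);
            return "none";
        } catch (java.io.IOException e) {
            return e.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(thrownName(false, true)); // AccessControlException
        System.out.println(thrownName(true, true));  // ParentNotDirectoryException
    }
}
```

The conversion means read and write callers never learn whether the failure was structural or permission-related; only create paths propagate the more specific exception.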

Example 2 with ParentNotDirectoryException

Use of org.apache.hadoop.fs.ParentNotDirectoryException in the Apache Hadoop project.

From the class AbstractContractMkdirTest, method testNoMkdirOverFile:

@Test
public void testNoMkdirOverFile() throws Throwable {
    describe("try to mkdir over a file");
    FileSystem fs = getFileSystem();
    Path path = path("testNoMkdirOverFile");
    byte[] dataset = dataset(1024, ' ', 'z');
    createFile(getFileSystem(), path, false, dataset);
    try {
        boolean made = fs.mkdirs(path);
        fail("mkdirs did not fail over a file but returned " + made + "; " + ls(path));
    } catch (ParentNotDirectoryException | FileAlreadyExistsException e) {
        // expected: mkdirs must not overwrite an existing file
        handleExpectedException(e);
    } catch (IOException e) {
        // relaxed contract: the FS rejected the mkdir with a generic IOException
        handleRelaxedException("mkdirs", "FileAlreadyExistsException", e);
    }
    assertIsFile(path);
    byte[] bytes = ContractTestUtils.readDataset(getFileSystem(), path, dataset.length);
    ContractTestUtils.compareByteArrays(dataset, bytes, dataset.length);
    assertPathExists("mkdir failed", path);
    assertDeleted(path, true);
}
Also used : Path(org.apache.hadoop.fs.Path) FileAlreadyExistsException(org.apache.hadoop.fs.FileAlreadyExistsException) ParentNotDirectoryException(org.apache.hadoop.fs.ParentNotDirectoryException) FileSystem(org.apache.hadoop.fs.FileSystem) IOException(java.io.IOException) Test(org.junit.Test)
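On the local filesystem the analogous failure is easy to reproduce with plain java.nio, outside of Hadoop entirely: Files.createDirectory on a path that already exists as a regular file throws FileAlreadyExistsException, matching the second exception type the contract test accepts. A minimal sketch (class and method names are illustrative):

```java
import java.io.IOException;
import java.nio.file.FileAlreadyExistsException;
import java.nio.file.Files;
import java.nio.file.Path;

public class MkdirOverFileLocal {
    // Returns the simple name of the exception raised when mkdir targets an
    // existing regular file, or "none" if the call unexpectedly succeeds.
    static String mkdirOverFile() throws IOException {
        Path file = Files.createTempFile("mkdir-target", ".tmp");
        try {
            Files.createDirectory(file);   // target already exists as a file
            return "none";
        } catch (FileAlreadyExistsException e) {
            return "FileAlreadyExistsException";
        } finally {
            Files.deleteIfExists(file);
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(mkdirOverFile()); // FileAlreadyExistsException
    }
}
```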

Example 3 with ParentNotDirectoryException

Use of org.apache.hadoop.fs.ParentNotDirectoryException in the Apache Hadoop project.

From the class AbstractContractMkdirTest, method testMkdirOverParentFile:

@Test
public void testMkdirOverParentFile() throws Throwable {
    describe("try to mkdir where a parent is a file");
    FileSystem fs = getFileSystem();
    Path path = path("testMkdirOverParentFile");
    byte[] dataset = dataset(1024, ' ', 'z');
    createFile(getFileSystem(), path, false, dataset);
    Path child = new Path(path, "child-to-mkdir");
    try {
        boolean made = fs.mkdirs(child);
        fail("mkdirs did not fail over a file but returned " + made + "; " + ls(path));
    } catch (ParentNotDirectoryException | FileAlreadyExistsException e) {
        // expected: the parent is a file, not a directory
        handleExpectedException(e);
    } catch (IOException e) {
        handleRelaxedException("mkdirs", "ParentNotDirectoryException", e);
    }
    assertIsFile(path);
    byte[] bytes = ContractTestUtils.readDataset(getFileSystem(), path, dataset.length);
    ContractTestUtils.compareByteArrays(dataset, bytes, dataset.length);
    assertPathExists("mkdir failed", path);
    assertDeleted(path, true);
}
Also used : Path(org.apache.hadoop.fs.Path) FileAlreadyExistsException(org.apache.hadoop.fs.FileAlreadyExistsException) ParentNotDirectoryException(org.apache.hadoop.fs.ParentNotDirectoryException) FileSystem(org.apache.hadoop.fs.FileSystem) IOException(java.io.IOException) Test(org.junit.Test)
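The parent-is-a-file case (and the deep variant in the Swift test below) also has a local java.nio analogue. The exact IOException subtype thrown by Files.createDirectories when an ancestor is a regular file is provider-dependent, so this sketch only asserts that the call fails; names here are illustrative, not Hadoop API:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class MkdirUnderFileLocal {
    // Returns true if creating directories beneath an existing regular file
    // fails, mirroring what the Hadoop contract test expects from mkdirs().
    static boolean mkdirsUnderFileFails(String... children) throws IOException {
        Path file = Files.createTempFile("not-a-dir", ".tmp");
        try {
            Path target = file;
            for (String child : children) {
                target = target.resolve(child);
            }
            Files.createDirectories(target);
            return false;                 // unexpectedly succeeded
        } catch (IOException expected) {
            return true;                  // local analogue of ParentNotDirectoryException
        } finally {
            Files.deleteIfExists(file);
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(mkdirsUnderFileFails("child-to-mkdir"));     // true
        System.out.println(mkdirsUnderFileFails("deep", "sub", "dir")); // true
    }
}
```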

Example 4 with ParentNotDirectoryException

Use of org.apache.hadoop.fs.ParentNotDirectoryException in the Apache Hadoop project.

From the class TestSwiftFileSystemContract, method testMkdirsFailsForSubdirectoryOfExistingFile:

@Override
public void testMkdirsFailsForSubdirectoryOfExistingFile() throws Exception {
    Path testDir = path("/test/hadoop");
    assertFalse(fs.exists(testDir));
    assertTrue(fs.mkdirs(testDir));
    assertTrue(fs.exists(testDir));
    Path filepath = path("/test/hadoop/file");
    SwiftTestUtils.writeTextFile(fs, filepath, "hello, world", false);
    Path testSubDir = new Path(filepath, "subdir");
    SwiftTestUtils.assertPathDoesNotExist(fs, "subdir before mkdir", testSubDir);
    try {
        fs.mkdirs(testSubDir);
        fail("Should throw ParentNotDirectoryException.");
    } catch (ParentNotDirectoryException e) {
        // expected
    }
    //now verify that the subdir path does not exist
    SwiftTestUtils.assertPathDoesNotExist(fs, "subdir after mkdir", testSubDir);
    Path testDeepSubDir = path("/test/hadoop/file/deep/sub/dir");
    try {
        fs.mkdirs(testDeepSubDir);
        fail("Should throw ParentNotDirectoryException.");
    } catch (ParentNotDirectoryException e) {
        // expected
    }
    SwiftTestUtils.assertPathDoesNotExist(fs, "testDeepSubDir after mkdir", testDeepSubDir);
}
Also used : Path(org.apache.hadoop.fs.Path) ParentNotDirectoryException(org.apache.hadoop.fs.ParentNotDirectoryException)

Example 5 with ParentNotDirectoryException

Use of org.apache.hadoop.fs.ParentNotDirectoryException in the Apache Hadoop project.

From the class SwiftNativeFileSystem, method shouldCreate:

/**
   * Should mkdir create this directory?
   * If the directory is root: false.
   * If the entry exists and is a directory: false.
   * If the entry exists and is a file: exception.
   * Otherwise: true.
   * @param directory path to query
   * @return true iff the directory should be created
   * @throws IOException IO problems
   * @throws ParentNotDirectoryException if the path references a file
   */
private boolean shouldCreate(Path directory) throws IOException {
    FileStatus fileStatus;
    boolean shouldCreate;
    if (isRoot(directory)) {
        // it's the base dir; bail out immediately
        return false;
    }
    try {
        //find out about the path
        fileStatus = getFileStatus(directory);
        if (!SwiftUtils.isDirectory(fileStatus)) {
            //if it's a file, raise an error
            throw new ParentNotDirectoryException(String.format("%s: can't mkdir since it exists and is not a directory: %s", directory, fileStatus));
        } else {
            //path exists, and it is a directory
            if (LOG.isDebugEnabled()) {
                LOG.debug("skipping mkdir(" + directory + ") as it exists already");
            }
            shouldCreate = false;
        }
    } catch (FileNotFoundException e) {
        shouldCreate = true;
    }
    return shouldCreate;
}
Also used : FileStatus(org.apache.hadoop.fs.FileStatus) ParentNotDirectoryException(org.apache.hadoop.fs.ParentNotDirectoryException) FileNotFoundException(java.io.FileNotFoundException)
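The decision table in shouldCreate (root: false; existing directory: false; existing file: exception; missing: true) can be sketched against the local filesystem with java.nio. This is an illustrative stand-in, not the Swift implementation; a plain IOException substitutes for Hadoop's ParentNotDirectoryException:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ShouldCreateSketch {
    // Mirrors SwiftNativeFileSystem.shouldCreate():
    //   root          -> false (never create)
    //   missing       -> true  (caller should mkdir)
    //   existing file -> IOException (Hadoop throws ParentNotDirectoryException)
    //   existing dir  -> false (nothing to do)
    static boolean shouldCreate(Path directory) throws IOException {
        if (directory.getParent() == null) {
            return false;
        }
        if (!Files.exists(directory)) {
            return true;
        }
        if (!Files.isDirectory(directory)) {
            throw new IOException(
                directory + ": can't mkdir since it exists and is not a directory");
        }
        return false;
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempDirectory("should-create");
        System.out.println(shouldCreate(tmp));                    // false: already a directory
        System.out.println(shouldCreate(tmp.resolve("missing"))); // true: not there yet
    }
}
```

Note the same asymmetry as the original: an existing directory is a silent no-op, but an existing file is an error, because proceeding would silently shadow user data.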

Aggregations

ParentNotDirectoryException (org.apache.hadoop.fs.ParentNotDirectoryException): 12
Path (org.apache.hadoop.fs.Path): 8
FileNotFoundException (java.io.FileNotFoundException): 6
IOException (java.io.IOException): 6
Test (org.junit.Test): 5
FileAlreadyExistsException (org.apache.hadoop.fs.FileAlreadyExistsException): 4
AccessControlException (org.apache.hadoop.security.AccessControlException): 3
FileSystem (org.apache.hadoop.fs.FileSystem): 2
VisibleForTesting (com.google.common.annotations.VisibleForTesting): 1
Path (java.nio.file.Path): 1
ArrayList (java.util.ArrayList): 1
Configuration (org.apache.hadoop.conf.Configuration): 1
CreateFlag (org.apache.hadoop.fs.CreateFlag): 1
FileStatus (org.apache.hadoop.fs.FileStatus): 1
InvalidPathException (org.apache.hadoop.fs.InvalidPathException): 1
FsPermission (org.apache.hadoop.fs.permission.FsPermission): 1
SnapshotAccessControlException (org.apache.hadoop.hdfs.protocol.SnapshotAccessControlException): 1
BlockStoragePolicySuite (org.apache.hadoop.hdfs.server.blockmanagement.BlockStoragePolicySuite): 1
ChunkedArrayList (org.apache.hadoop.util.ChunkedArrayList): 1