
Example 1 with UnresolvedLinkException

Use of org.apache.hadoop.fs.UnresolvedLinkException in project hadoop by apache, in the class DistributedFileSystem, method getQuotaUsage:

@Override
public QuotaUsage getQuotaUsage(Path f) throws IOException {
    statistics.incrementReadOps(1);
    storageStatistics.incrementOpCounter(OpType.GET_QUOTA_USAGE);
    Path absF = fixRelativePart(f);
    return new FileSystemLinkResolver<QuotaUsage>() {

        @Override
        public QuotaUsage doCall(final Path p) throws IOException, UnresolvedLinkException {
            return dfs.getQuotaUsage(getPathName(p));
        }

        @Override
        public QuotaUsage next(final FileSystem fs, final Path p) throws IOException {
            return fs.getQuotaUsage(p);
        }
    }.resolve(this, absF);
}
Also used: Path(org.apache.hadoop.fs.Path) UnresolvedLinkException(org.apache.hadoop.fs.UnresolvedLinkException) FileSystem(org.apache.hadoop.fs.FileSystem) QuotaUsage(org.apache.hadoop.fs.QuotaUsage) IOException(java.io.IOException)
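
For context, a minimal caller-side sketch of how this resolver-backed method might be invoked through the public FileSystem API. The namenode URI and the path are placeholder assumptions, not taken from the example above.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.QuotaUsage;

public class QuotaUsageExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // hdfs://namenode:8020 and /user/example are assumed placeholders.
        try (FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:8020"), conf)) {
            // On a DistributedFileSystem, any symlink in the path is resolved through
            // the FileSystemLinkResolver shown above before the quota is fetched.
            QuotaUsage usage = fs.getQuotaUsage(new Path("/user/example"));
            System.out.println("file and directory count: " + usage.getFileAndDirectoryCount());
            System.out.println("space consumed: " + usage.getSpaceConsumed());
        }
    }
}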

Example 2 with UnresolvedLinkException

Use of org.apache.hadoop.fs.UnresolvedLinkException in project hadoop by apache, in the class DistributedFileSystem, method rename:

@SuppressWarnings("deprecation")
@Override
public boolean rename(Path src, Path dst) throws IOException {
    statistics.incrementWriteOps(1);
    storageStatistics.incrementOpCounter(OpType.RENAME);
    final Path absSrc = fixRelativePart(src);
    final Path absDst = fixRelativePart(dst);
    // Try the rename without resolving first
    try {
        return dfs.rename(getPathName(absSrc), getPathName(absDst));
    } catch (UnresolvedLinkException e) {
        // Fully resolve the source
        final Path source = getFileLinkStatus(absSrc).getPath();
        // Keep trying to resolve the destination
        return new FileSystemLinkResolver<Boolean>() {

            @Override
            public Boolean doCall(final Path p) throws IOException {
                return dfs.rename(getPathName(source), getPathName(p));
            }

            @Override
            public Boolean next(final FileSystem fs, final Path p) throws IOException {
                // Should just throw an error in FileSystem#checkPath
                return doCall(p);
            }
        }.resolve(this, absDst);
    }
}
Also used: Path(org.apache.hadoop.fs.Path) UnresolvedLinkException(org.apache.hadoop.fs.UnresolvedLinkException) FileSystem(org.apache.hadoop.fs.FileSystem) FileSystemLinkResolver(org.apache.hadoop.fs.FileSystemLinkResolver)
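
A hedged caller-side sketch of the rename path above. The source and destination paths are placeholders; note that FileSystem#rename reports most failures through its boolean return value rather than by throwing.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class RenameExample {
    public static void main(String[] args) throws Exception {
        try (FileSystem fs = FileSystem.get(new Configuration())) {
            // /tmp/src and /tmp/dst are assumed placeholder paths.
            // On an UnresolvedLinkException, DistributedFileSystem fully resolves the
            // source and retries while resolving the destination, as shown above.
            boolean renamed = fs.rename(new Path("/tmp/src"), new Path("/tmp/dst"));
            if (!renamed) {
                System.err.println("rename failed");
            }
        }
    }
}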

Example 3 with UnresolvedLinkException

Use of org.apache.hadoop.fs.UnresolvedLinkException in project incubator-crail by apache, in the class CrailHDFS, method getFileStatus:

@Override
public FileStatus getFileStatus(Path path) throws AccessControlException, FileNotFoundException, UnresolvedLinkException, IOException {
    CrailNode directFile = null;
    try {
        directFile = dfs.lookup(path.toUri().getRawPath()).get();
    } catch (Exception e) {
        throw new IOException(e);
    }
    if (directFile == null) {
        throw new FileNotFoundException("filename " + path);
    }
    FsPermission permission = FsPermission.getFileDefault();
    if (directFile.getType().isDirectory()) {
        permission = FsPermission.getDirDefault();
    }
    FileStatus status = new FileStatus(directFile.getCapacity(), directFile.getType().isContainer(), CrailConstants.SHADOW_REPLICATION, CrailConstants.BLOCK_SIZE, directFile.getModificationTime(), directFile.getModificationTime(), permission, CrailConstants.USER, CrailConstants.USER, path.makeQualified(this.getUri(), this.workingDir));
    return status;
}
Also used: FileStatus(org.apache.hadoop.fs.FileStatus) CrailNode(org.apache.crail.CrailNode) FileNotFoundException(java.io.FileNotFoundException) IOException(java.io.IOException) FsPermission(org.apache.hadoop.fs.permission.FsPermission) URISyntaxException(java.net.URISyntaxException) UnresolvedLinkException(org.apache.hadoop.fs.UnresolvedLinkException) ParentNotDirectoryException(org.apache.hadoop.fs.ParentNotDirectoryException) FileAlreadyExistsException(org.apache.hadoop.fs.FileAlreadyExistsException) AccessControlException(org.apache.hadoop.security.AccessControlException) UnsupportedFileSystemException(org.apache.hadoop.fs.UnsupportedFileSystemException)
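
CrailHDFS is an AbstractFileSystem implementation, so callers usually reach getFileStatus through FileContext. A minimal sketch, assuming Crail is configured as the file system behind the default FileContext and that the path /data/example exists (both are assumptions):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileContext;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.Path;

public class GetFileStatusExample {
    public static void main(String[] args) throws Exception {
        // FileContext dispatches to the configured AbstractFileSystem implementation.
        FileContext fc = FileContext.getFileContext(new Configuration());
        FileStatus status = fc.getFileStatus(new Path("/data/example"));
        System.out.println(status.getPath() + " isDirectory=" + status.isDirectory()
                + " length=" + status.getLen());
    }
}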

Example 4 with UnresolvedLinkException

Use of org.apache.hadoop.fs.UnresolvedLinkException in project incubator-crail by apache, in the class CrailHDFS, method open:

@Override
public FSDataInputStream open(Path path, int bufferSize) throws AccessControlException, FileNotFoundException, UnresolvedLinkException, IOException {
    CrailFile fileInfo = null;
    try {
        fileInfo = dfs.lookup(path.toUri().getRawPath()).get().asFile();
    } catch (Exception e) {
        throw new IOException(e);
    }
    CrailBufferedInputStream inputStream = null;
    if (fileInfo != null) {
        try {
            inputStream = fileInfo.getBufferedInputStream(fileInfo.getCapacity());
        } catch (Exception e) {
            throw new IOException(e);
        }
    }
    if (inputStream != null) {
        return new CrailHDFSInputStream(inputStream);
    } else {
        throw new IOException("Failed to open file, path " + path.toString());
    }
}
Also used: CrailFile(org.apache.crail.CrailFile) IOException(java.io.IOException) CrailBufferedInputStream(org.apache.crail.CrailBufferedInputStream) URISyntaxException(java.net.URISyntaxException) UnresolvedLinkException(org.apache.hadoop.fs.UnresolvedLinkException) ParentNotDirectoryException(org.apache.hadoop.fs.ParentNotDirectoryException) FileAlreadyExistsException(org.apache.hadoop.fs.FileAlreadyExistsException) FileNotFoundException(java.io.FileNotFoundException) AccessControlException(org.apache.hadoop.security.AccessControlException) UnsupportedFileSystemException(org.apache.hadoop.fs.UnsupportedFileSystemException)
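
A short sketch of opening and reading a file through FileContext, which, with Crail configured as the backing AbstractFileSystem, would hand back the stream created above. The path and buffer size are placeholder assumptions:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileContext;
import org.apache.hadoop.fs.Path;

public class OpenExample {
    public static void main(String[] args) throws Exception {
        FileContext fc = FileContext.getFileContext(new Configuration());
        // /data/example.bin is an assumed placeholder path.
        try (FSDataInputStream in = fc.open(new Path("/data/example.bin"))) {
            byte[] buffer = new byte[4096];
            int read = in.read(buffer);
            System.out.println("read " + read + " bytes");
        }
    }
}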

Example 5 with UnresolvedLinkException

Use of org.apache.hadoop.fs.UnresolvedLinkException in project incubator-crail by apache, in the class CrailHDFS, method listStatus:

@Override
public FileStatus[] listStatus(Path path) throws AccessControlException, FileNotFoundException, UnresolvedLinkException, IOException {
    try {
        CrailNode node = dfs.lookup(path.toUri().getRawPath()).get();
        Iterator<String> iter = node.asContainer().listEntries();
        ArrayList<FileStatus> statusList = new ArrayList<FileStatus>();
        while (iter.hasNext()) {
            String filepath = iter.next();
            CrailNode directFile = dfs.lookup(filepath).get();
            if (directFile != null) {
                FsPermission permission = FsPermission.getFileDefault();
                if (directFile.getType().isDirectory()) {
                    permission = FsPermission.getDirDefault();
                }
                FileStatus status = new FileStatus(directFile.getCapacity(), directFile.getType().isContainer(), CrailConstants.SHADOW_REPLICATION, CrailConstants.BLOCK_SIZE, directFile.getModificationTime(), directFile.getModificationTime(), permission, CrailConstants.USER, CrailConstants.USER, new Path(filepath).makeQualified(this.getUri(), workingDir));
                statusList.add(status);
            }
        }
        FileStatus[] list = new FileStatus[statusList.size()];
        statusList.toArray(list);
        return list;
    } catch (Exception e) {
        throw new FileNotFoundException(path.toUri().getRawPath());
    }
}
Also used: Path(org.apache.hadoop.fs.Path) FileStatus(org.apache.hadoop.fs.FileStatus) CrailNode(org.apache.crail.CrailNode) ArrayList(java.util.ArrayList) FileNotFoundException(java.io.FileNotFoundException) FsPermission(org.apache.hadoop.fs.permission.FsPermission) URISyntaxException(java.net.URISyntaxException) UnresolvedLinkException(org.apache.hadoop.fs.UnresolvedLinkException) ParentNotDirectoryException(org.apache.hadoop.fs.ParentNotDirectoryException) FileAlreadyExistsException(org.apache.hadoop.fs.FileAlreadyExistsException) IOException(java.io.IOException) AccessControlException(org.apache.hadoop.security.AccessControlException) UnsupportedFileSystemException(org.apache.hadoop.fs.UnsupportedFileSystemException)
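
A minimal sketch of listing a directory through FileContext, again assuming Crail is the configured AbstractFileSystem. The directory path is a placeholder:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileContext;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RemoteIterator;

public class ListStatusExample {
    public static void main(String[] args) throws Exception {
        FileContext fc = FileContext.getFileContext(new Configuration());
        // /data is an assumed placeholder directory.
        RemoteIterator<FileStatus> entries = fc.listStatus(new Path("/data"));
        while (entries.hasNext()) {
            FileStatus status = entries.next();
            System.out.println(status.getPath() + "\t" + status.getLen());
        }
    }
}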

Aggregations

UnresolvedLinkException (org.apache.hadoop.fs.UnresolvedLinkException): 15
IOException (java.io.IOException): 13
Path (org.apache.hadoop.fs.Path): 11
FileNotFoundException (java.io.FileNotFoundException): 7
FileSystem (org.apache.hadoop.fs.FileSystem): 7
URISyntaxException (java.net.URISyntaxException): 6
FileAlreadyExistsException (org.apache.hadoop.fs.FileAlreadyExistsException): 6
ParentNotDirectoryException (org.apache.hadoop.fs.ParentNotDirectoryException): 6
UnsupportedFileSystemException (org.apache.hadoop.fs.UnsupportedFileSystemException): 6
AccessControlException (org.apache.hadoop.security.AccessControlException): 6
DistributedFileSystem (org.apache.hadoop.hdfs.DistributedFileSystem): 4
FileStatus (org.apache.hadoop.fs.FileStatus): 3
CrailFile (org.apache.crail.CrailFile): 2
CrailNode (org.apache.crail.CrailNode): 2
FileSystemLinkResolver (org.apache.hadoop.fs.FileSystemLinkResolver): 2
FsPermission (org.apache.hadoop.fs.permission.FsPermission): 2
HdfsFileStatus (org.apache.hadoop.hdfs.protocol.HdfsFileStatus): 2
ArrayList (java.util.ArrayList): 1
CrailBlockLocation (org.apache.crail.CrailBlockLocation): 1
CrailBufferedInputStream (org.apache.crail.CrailBufferedInputStream)