Example 21 with Op

Use of org.apache.hadoop.hdfs.web.resources.HttpOpParam.Op in project hadoop by apache.

In class WebHdfsFileSystem, method createNonRecursive:

@Override
public FSDataOutputStream createNonRecursive(final Path f, final FsPermission permission,
        final EnumSet<CreateFlag> flag, final int bufferSize, final short replication,
        final long blockSize, final Progressable progress) throws IOException {
    statistics.incrementWriteOps(1);
    storageStatistics.incrementOpCounter(OpType.CREATE_NON_RECURSIVE);
    final FsPermission modes = applyUMask(permission);
    final HttpOpParam.Op op = PutOpParam.Op.CREATE;
    return new FsPathOutputStreamRunner(op, f, bufferSize,
        new PermissionParam(modes.getMasked()),
        new UnmaskedPermissionParam(modes.getUnmasked()),
        new CreateFlagParam(flag),
        new CreateParentParam(false),
        new BufferSizeParam(bufferSize),
        new ReplicationParam(replication),
        new BlockSizeParam(blockSize)).run();
}
Also used : Op(org.apache.hadoop.hdfs.web.resources.HttpOpParam.Op) FsPermission(org.apache.hadoop.fs.permission.FsPermission)
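
A minimal usage sketch, not taken from the Hadoop sources: the endpoint, path, and sizes below are placeholders. It shows how a client might call createNonRecursive over WebHDFS; because CreateParentParam is fixed to false above, the write fails if the parent directory does not already exist.

import java.net.URI;
import java.util.EnumSet;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.CreateFlag;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class CreateNonRecursiveSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint; point this at a real namenode HTTP address.
        FileSystem fs = FileSystem.get(URI.create("webhdfs://namenode:9870"), new Configuration());
        Path file = new Path("/tmp/existing-dir/part-0000");
        // Fails if /tmp/existing-dir is missing, since the CREATE op is sent
        // with create-parent disabled.
        try (FSDataOutputStream out = fs.createNonRecursive(file, FsPermission.getFileDefault(),
                EnumSet.of(CreateFlag.CREATE), 4096, (short) 3, 128 * 1024 * 1024L, null)) {
            out.writeBytes("hello webhdfs\n");
        }
    }
}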

Example 22 with Op

Use of org.apache.hadoop.hdfs.web.resources.HttpOpParam.Op in project hadoop by apache.

In class WebHdfsFileSystem, method getHomeDirectory:

@Override
public Path getHomeDirectory() {
    if (cachedHomeDirectory == null) {
        final HttpOpParam.Op op = GetOpParam.Op.GETHOMEDIRECTORY;
        try {
            String pathFromDelegatedFS = new FsPathResponseRunner<String>(op, null, new UserParam(ugi)) {

                @Override
                String decodeResponse(Map<?, ?> json) throws IOException {
                    return JsonUtilClient.getPath(json);
                }
            }.run();
            cachedHomeDirectory = new Path(pathFromDelegatedFS).makeQualified(this.getUri(), null);
        } catch (IOException e) {
            LOG.error("Unable to get HomeDirectory from original File System", e);
            cachedHomeDirectory = new Path("/user/" + ugi.getShortUserName()).makeQualified(this.getUri(), null);
        }
    }
    return cachedHomeDirectory;
}
Also used : Path(org.apache.hadoop.fs.Path) Op(org.apache.hadoop.hdfs.web.resources.HttpOpParam.Op) IOException(java.io.IOException)
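
A minimal usage sketch (the endpoint is a placeholder, not from the Hadoop sources): the first call issues the GETHOMEDIRECTORY op, later calls return the cached value, and on an IOException the method above falls back to /user/<short user name>.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HomeDirectorySketch {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(URI.create("webhdfs://namenode:9870"), new Configuration());
        Path home = fs.getHomeDirectory();   // first call hits the GETHOMEDIRECTORY op
        Path again = fs.getHomeDirectory();  // served from cachedHomeDirectory, no new request
        System.out.println(home + " " + again);
    }
}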

Example 23 with Op

Use of org.apache.hadoop.hdfs.web.resources.HttpOpParam.Op in project hadoop by apache.

In class WebHdfsFileSystem, method setPermission:

@Override
public void setPermission(final Path p, final FsPermission permission) throws IOException {
    statistics.incrementWriteOps(1);
    storageStatistics.incrementOpCounter(OpType.SET_PERMISSION);
    final HttpOpParam.Op op = PutOpParam.Op.SETPERMISSION;
    new FsPathRunner(op, p, new PermissionParam(permission)).run();
}
Also used : Op(org.apache.hadoop.hdfs.web.resources.HttpOpParam.Op)
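
A minimal usage sketch (endpoint, path, and mode are placeholders): setPermission on a WebHdfsFileSystem issues the SETPERMISSION op with the given permission.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class SetPermissionSketch {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(URI.create("webhdfs://namenode:9870"), new Configuration());
        // Equivalent of "hdfs dfs -chmod 644 /tmp/data.txt", but over WebHDFS.
        fs.setPermission(new Path("/tmp/data.txt"), new FsPermission((short) 0644));
    }
}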

Example 24 with Op

Use of org.apache.hadoop.hdfs.web.resources.HttpOpParam.Op in project hadoop by apache.

In class WebHdfsFileSystem, method rename:

@SuppressWarnings("deprecation")
@Override
public void rename(final Path src, final Path dst, final Options.Rename... options) throws IOException {
    statistics.incrementWriteOps(1);
    storageStatistics.incrementOpCounter(OpType.RENAME);
    final HttpOpParam.Op op = PutOpParam.Op.RENAME;
    new FsPathRunner(op, src, new DestinationParam(makeQualified(dst).toUri().getPath()), new RenameOptionSetParam(options)).run();
}
Also used : Op(org.apache.hadoop.hdfs.web.resources.HttpOpParam.Op)
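
A minimal usage sketch, assuming the webhdfs AbstractFileSystem binding is configured so that FileContext can reach this deprecated rename overload (endpoint and paths are placeholders); Rename.OVERWRITE is the value that ends up in RenameOptionSetParam.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileContext;
import org.apache.hadoop.fs.Options;
import org.apache.hadoop.fs.Path;

public class RenameSketch {
    public static void main(String[] args) throws Exception {
        FileContext fc = FileContext.getFileContext(URI.create("webhdfs://namenode:9870"),
            new Configuration());
        // Overwrite the destination if it already exists; maps to the RENAME op
        // with the rename-options parameter set.
        fc.rename(new Path("/tmp/src.txt"), new Path("/tmp/dst.txt"), Options.Rename.OVERWRITE);
    }
}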

Example 25 with Op

Use of org.apache.hadoop.hdfs.web.resources.HttpOpParam.Op in project hadoop by apache.

In class WebHdfsFileSystem, method setAcl:

@Override
public void setAcl(final Path p, final List<AclEntry> aclSpec) throws IOException {
    statistics.incrementWriteOps(1);
    storageStatistics.incrementOpCounter(OpType.SET_ACL);
    final HttpOpParam.Op op = PutOpParam.Op.SETACL;
    new FsPathRunner(op, p, new AclPermissionParam(aclSpec)).run();
}
Also used : Op(org.apache.hadoop.hdfs.web.resources.HttpOpParam.Op)
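
A minimal usage sketch (endpoint, path, and ACL entries are placeholders): setAcl replaces the entire ACL, so the spec should carry the base user, group, and other entries alongside any named entries.

import java.net.URI;
import java.util.Arrays;
import java.util.List;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.AclEntry;
import org.apache.hadoop.fs.permission.AclEntryScope;
import org.apache.hadoop.fs.permission.AclEntryType;
import org.apache.hadoop.fs.permission.FsAction;

public class SetAclSketch {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(URI.create("webhdfs://namenode:9870"), new Configuration());
        List<AclEntry> aclSpec = Arrays.asList(
            entry(AclEntryType.USER, null, FsAction.ALL),            // base owner entry
            entry(AclEntryType.USER, "alice", FsAction.READ_WRITE),  // named user entry
            entry(AclEntryType.GROUP, null, FsAction.READ_EXECUTE),  // base group entry
            entry(AclEntryType.OTHER, null, FsAction.NONE));         // base other entry
        fs.setAcl(new Path("/tmp/data.txt"), aclSpec);  // sends the SETACL op
    }

    private static AclEntry entry(AclEntryType type, String name, FsAction perm) {
        AclEntry.Builder b = new AclEntry.Builder()
            .setScope(AclEntryScope.ACCESS).setType(type).setPermission(perm);
        if (name != null) {
            b.setName(name);
        }
        return b.build();
    }
}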

Aggregations

Op (org.apache.hadoop.hdfs.web.resources.HttpOpParam.Op) 40
IOException (java.io.IOException) 8
Path (org.apache.hadoop.fs.Path) 3
FsPermission (org.apache.hadoop.fs.permission.FsPermission) 3
FileNotFoundException (java.io.FileNotFoundException) 2
Map (java.util.Map) 1
ContentSummary (org.apache.hadoop.fs.ContentSummary) 1
MD5MD5CRC32FileChecksum (org.apache.hadoop.fs.MD5MD5CRC32FileChecksum) 1
AclStatus (org.apache.hadoop.fs.permission.AclStatus) 1
HdfsFileStatus (org.apache.hadoop.hdfs.protocol.HdfsFileStatus) 1
DelegationTokenIdentifier (org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenIdentifier) 1
AccessControlException (org.apache.hadoop.security.AccessControlException) 1
InvalidToken (org.apache.hadoop.security.token.SecretManager.InvalidToken) 1
Token (org.apache.hadoop.security.token.Token) 1