Example 1 with TraceScope

Use of org.apache.htrace.core.TraceScope in project hadoop by apache.

The class Globber, method glob().

public FileStatus[] glob() throws IOException {
    // Open a trace scope for the whole glob operation and tag the span
    // with the glob pattern being expanded.
    TraceScope scope = tracer.newScope("Globber#glob");
    scope.addKVAnnotation("pattern", pathPattern.toUri().getPath());
    try {
        return doGlob();
    } finally {
        // Always close the scope, even if doGlob() throws.
        scope.close();
    }
}
Also used: TraceScope (org.apache.htrace.core.TraceScope)
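
TraceScope is Closeable, so the explicit try/finally above can also be written with try-with-resources, as Example 3 below does. The following is a minimal, self-contained sketch of that form, not Hadoop code: the sampler configuration and the sample pattern string are illustrative assumptions.

import org.apache.htrace.core.HTraceConfiguration;
import org.apache.htrace.core.TraceScope;
import org.apache.htrace.core.Tracer;

public class GlobTraceSketch {
    public static void main(String[] args) {
        // Sketch only: sample every span; a real deployment would also configure
        // a span receiver so the spans are actually reported somewhere.
        Tracer tracer = new Tracer.Builder("GlobTraceSketch")
            .conf(HTraceConfiguration.fromKeyValuePairs("sampler.classes", "AlwaysSampler"))
            .build();
        // try-with-resources closes the scope automatically, mirroring the
        // try/finally in Globber#glob above.
        try (TraceScope scope = tracer.newScope("Globber#glob")) {
            scope.addKVAnnotation("pattern", "/user/*/part-*");
            // ... the traced work (doGlob() in the Hadoop source) would run here ...
        }
        tracer.close();
    }
}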

Example 2 with TraceScope

Use of org.apache.htrace.core.TraceScope in project hadoop by apache.

The class FsShell, method run().

/**
   * Run the given command and return its exit code.
   */
@Override
public int run(String[] argv) throws Exception {
    // initialize FsShell
    init();
    // Build a Tracer configured from the shell-scoped HTrace settings in the Hadoop conf.
    Tracer tracer = new Tracer.Builder("FsShell")
        .conf(TraceUtils.wrapHadoopConf(SHELL_HTRACE_PREFIX, getConf()))
        .build();
    int exitCode = -1;
    if (argv.length < 1) {
        printUsage(System.err);
    } else {
        String cmd = argv[0];
        Command instance = null;
        try {
            instance = commandFactory.getInstance(cmd);
            if (instance == null) {
                throw new UnknownCommandException();
            }
            // Name the span after the command; if sampling produced a span,
            // record the (possibly truncated) command-line arguments on it.
            TraceScope scope = tracer.newScope(instance.getCommandName());
            if (scope.getSpan() != null) {
                String args = StringUtils.join(" ", argv);
                if (args.length() > 2048) {
                    args = args.substring(0, 2048);
                }
                scope.getSpan().addKVAnnotation("args", args);
            }
            try {
                exitCode = instance.run(Arrays.copyOfRange(argv, 1, argv.length));
            } finally {
                scope.close();
            }
        } catch (IllegalArgumentException e) {
            if (e.getMessage() == null) {
                displayError(cmd, "Null exception message");
                e.printStackTrace(System.err);
            } else {
                displayError(cmd, e.getLocalizedMessage());
            }
            printUsage(System.err);
            if (instance != null) {
                printInstanceUsage(System.err, instance);
            }
        } catch (Exception e) {
            // instance.run catches IOE, so something is REALLY wrong if here
            LOG.debug("Error", e);
            displayError(cmd, "Fatal internal error");
            e.printStackTrace(System.err);
        }
    }
    tracer.close();
    return exitCode;
}
Also used: Command (org.apache.hadoop.fs.shell.Command), FsCommand (org.apache.hadoop.fs.shell.FsCommand), Tracer (org.apache.htrace.core.Tracer), TraceScope (org.apache.htrace.core.TraceScope), IOException (java.io.IOException)
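
Example 2 obtains its HTrace settings from the Hadoop Configuration by wrapping it with TraceUtils.wrapHadoopConf under FsShell's SHELL_HTRACE_PREFIX. Below is a hedged sketch of that wiring in isolation; the literal prefix string, the sampler key, and the sampler class name are assumptions for illustration only (the source above shows only the constant name).

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.tracing.TraceUtils;
import org.apache.htrace.core.TraceScope;
import org.apache.htrace.core.Tracer;

public class FsShellTracerSketch {
    // Hypothetical prefix for illustration; FsShell uses its own SHELL_HTRACE_PREFIX constant.
    private static final String TRACE_PREFIX = "fs.shell.htrace.";

    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Assumed HTrace 4 sampler key under the prefix: sample every span for this sketch.
        conf.set(TRACE_PREFIX + "sampler.classes", "AlwaysSampler");
        Tracer tracer = new Tracer.Builder("FsShellSketch")
            .conf(TraceUtils.wrapHadoopConf(TRACE_PREFIX, conf))
            .build();
        try (TraceScope scope = tracer.newScope("ls")) {
            // Mirror Example 2: annotate the span with the command arguments.
            scope.addKVAnnotation("args", "-R /user");
        }
        tracer.close();
    }
}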

Example 3 with TraceScope

Use of org.apache.htrace.core.TraceScope in project hadoop by apache.

The class FileSystem, method createFileSystem().

/**
   * Create and initialize a new instance of a FileSystem.
   * @param uri URI containing the FS schema and FS details
   * @param conf configuration to use to look for the FS instance declaration
   * and to pass to the {@link FileSystem#initialize(URI, Configuration)}.
   * @return the initialized filesystem.
   * @throws IOException problems loading or initializing the FileSystem
   */
private static FileSystem createFileSystem(URI uri, Configuration conf) throws IOException {
    Tracer tracer = FsTracer.get(conf);
    // try-with-resources closes the scope automatically, even if initialization fails.
    try (TraceScope scope = tracer.newScope("FileSystem#createFileSystem")) {
        scope.addKVAnnotation("scheme", uri.getScheme());
        Class<?> clazz = getFileSystemClass(uri.getScheme(), conf);
        FileSystem fs = (FileSystem) ReflectionUtils.newInstance(clazz, conf);
        fs.initialize(uri, conf);
        return fs;
    }
}
Also used: Tracer (org.apache.htrace.core.Tracer), TraceScope (org.apache.htrace.core.TraceScope)

Example 4 with TraceScope

Use of org.apache.htrace.core.TraceScope in project hadoop by apache.

The class Receiver, method opTransferBlock().

/** Receive {@link Op#TRANSFER_BLOCK} */
private void opTransferBlock(DataInputStream in) throws IOException {
    final OpTransferBlockProto proto = OpTransferBlockProto.parseFrom(vintPrefixed(in));
    final DatanodeInfo[] targets = PBHelperClient.convert(proto.getTargetsList());
    // continueTraceSpan returns null when the request carried no trace info,
    // so the scope must be null-checked before it is closed.
    TraceScope traceScope = continueTraceSpan(proto.getHeader(), proto.getClass().getSimpleName());
    try {
        transferBlock(PBHelperClient.convert(proto.getHeader().getBaseHeader().getBlock()),
            PBHelperClient.convert(proto.getHeader().getBaseHeader().getToken()),
            proto.getHeader().getClientName(),
            targets,
            PBHelperClient.convertStorageTypes(proto.getTargetStorageTypesList(), targets.length));
    } finally {
        if (traceScope != null) {
            traceScope.close();
        }
    }
}
Also used: DatanodeInfo (org.apache.hadoop.hdfs.protocol.DatanodeInfo), TraceScope (org.apache.htrace.core.TraceScope), OpTransferBlockProto (org.apache.hadoop.hdfs.protocol.proto.DataTransferProtos.OpTransferBlockProto)

Example 5 with TraceScope

Use of org.apache.htrace.core.TraceScope in project hadoop by apache.

The class Receiver, method opRequestShortCircuitShm().

/** Receive {@link Op#REQUEST_SHORT_CIRCUIT_SHM} */
private void opRequestShortCircuitShm(DataInputStream in) throws IOException {
    final ShortCircuitShmRequestProto proto = ShortCircuitShmRequestProto.parseFrom(vintPrefixed(in));
    TraceScope traceScope = continueTraceSpan(proto.getTraceInfo(), proto.getClass().getSimpleName());
    try {
        requestShortCircuitShm(proto.getClientName());
    } finally {
        if (traceScope != null) {
            traceScope.close();
        }
    }
}
Also used: TraceScope (org.apache.htrace.core.TraceScope), ShortCircuitShmRequestProto (org.apache.hadoop.hdfs.protocol.proto.DataTransferProtos.ShortCircuitShmRequestProto)
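
In Examples 4 and 5, continueTraceSpan only resumes a span when the incoming request carried trace information and returns null otherwise, which is why both methods null-check the scope before closing it. Below is a minimal, self-contained sketch of that resume-or-skip pattern; the parent SpanId is fabricated here purely so the sketch runs, whereas the Receiver decodes it from the request's trace-info proto.

import org.apache.htrace.core.HTraceConfiguration;
import org.apache.htrace.core.SpanId;
import org.apache.htrace.core.TraceScope;
import org.apache.htrace.core.Tracer;

public class ContinueSpanSketch {
    public static void main(String[] args) {
        Tracer tracer = new Tracer.Builder("ReceiverSketch")
            .conf(HTraceConfiguration.fromKeyValuePairs("sampler.classes", "AlwaysSampler"))
            .build();
        // Fabricated parent span id; in the Receiver it would come from the request.
        SpanId parent = new SpanId(1L, 2L);
        // Resume the remote trace only if a valid parent id was supplied.
        TraceScope scope = parent.isValid()
            ? tracer.newScope("opTransferBlock", parent)
            : null;
        try {
            // ... handle the operation (transferBlock in Example 4) ...
        } finally {
            if (scope != null) {
                scope.close();
            }
        }
        tracer.close();
    }
}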

Aggregations

TraceScope (org.apache.htrace.core.TraceScope): 54
IOException (java.io.IOException): 11
InterruptedIOException (java.io.InterruptedIOException): 7
MultipleIOException (org.apache.hadoop.io.MultipleIOException): 6
RemoteException (org.apache.hadoop.ipc.RemoteException): 5
FileNotFoundException (java.io.FileNotFoundException): 4
SnapshotAccessControlException (org.apache.hadoop.hdfs.protocol.SnapshotAccessControlException): 4
UnresolvedPathException (org.apache.hadoop.hdfs.protocol.UnresolvedPathException): 4
AccessControlException (org.apache.hadoop.security.AccessControlException): 4
ClosedChannelException (java.nio.channels.ClosedChannelException): 3
FileAlreadyExistsException (org.apache.hadoop.fs.FileAlreadyExistsException): 3
ParentNotDirectoryException (org.apache.hadoop.fs.ParentNotDirectoryException): 3
DSQuotaExceededException (org.apache.hadoop.hdfs.protocol.DSQuotaExceededException): 3
NSQuotaExceededException (org.apache.hadoop.hdfs.protocol.NSQuotaExceededException): 3
QuotaByStorageTypeExceededException (org.apache.hadoop.hdfs.protocol.QuotaByStorageTypeExceededException): 3
Tracer (org.apache.htrace.core.Tracer): 3
ByteBuffer (java.nio.ByteBuffer): 2
List (java.util.List): 2
EventBatch (org.apache.hadoop.hdfs.inotify.EventBatch): 2
DatanodeInfo (org.apache.hadoop.hdfs.protocol.DatanodeInfo): 2