Example 11 with Counters

Use of org.apache.hadoop.mapreduce.v2.api.records.Counters in project hadoop by apache.

The class JobImpl, method constructFinalFullcounters.

@Private
public void constructFinalFullcounters() {
    this.fullCounters = new Counters();
    this.finalMapCounters = new Counters();
    this.finalReduceCounters = new Counters();
    this.fullCounters.incrAllCounters(jobCounters);
    for (Task t : this.tasks.values()) {
        Counters counters = t.getCounters();
        switch(t.getType()) {
            case MAP:
                this.finalMapCounters.incrAllCounters(counters);
                break;
            case REDUCE:
                this.finalReduceCounters.incrAllCounters(counters);
                break;
            default:
                throw new IllegalStateException("Task type neither map nor reduce: " + t.getType());
        }
        this.fullCounters.incrAllCounters(counters);
    }
}
Also used : Task(org.apache.hadoop.mapreduce.v2.app.job.Task) Counters(org.apache.hadoop.mapreduce.Counters) Private(org.apache.hadoop.classification.InterfaceAudience.Private)
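As a side note, the aggregation above relies on Counters.incrAllCounters summing matching counters across repeated calls rather than overwriting them. A minimal standalone sketch of that behavior (illustrative only, not part of JobImpl; the group and counter names are made up):

import org.apache.hadoop.mapreduce.Counters;

public class CounterAggregationSketch {
    public static void main(String[] args) {
        // Pretend these are the counters reported by one map task and one reduce task.
        Counters mapTaskCounters = new Counters();
        mapTaskCounters.findCounter("example", "RECORDS").increment(100);
        Counters reduceTaskCounters = new Counters();
        reduceTaskCounters.findCounter("example", "RECORDS").increment(40);

        // Fold both into a job-wide total, as constructFinalFullcounters does per task.
        Counters full = new Counters();
        full.incrAllCounters(mapTaskCounters);
        full.incrAllCounters(reduceTaskCounters);

        // Prints 140: matching counters are summed, not replaced.
        System.out.println(full.findCounter("example", "RECORDS").getValue());
    }
}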

Example 12 with Counters

Use of org.apache.hadoop.mapreduce.v2.api.records.Counters in project hadoop by apache.

The class AMWebServices, method getJobCounters.

@GET
@Path("/jobs/{jobid}/counters")
@Produces({ MediaType.APPLICATION_JSON + "; " + JettyUtils.UTF_8, MediaType.APPLICATION_XML + "; " + JettyUtils.UTF_8 })
public JobCounterInfo getJobCounters(@Context HttpServletRequest hsr, @PathParam("jobid") String jid) {
    init();
    Job job = getJobFromJobIdString(jid, appCtx);
    checkAccess(job, hsr);
    return new JobCounterInfo(this.appCtx, job);
}
Also used : JobCounterInfo(org.apache.hadoop.mapreduce.v2.app.webapp.dao.JobCounterInfo) Job(org.apache.hadoop.mapreduce.v2.app.job.Job) Path(javax.ws.rs.Path) Produces(javax.ws.rs.Produces) GET(javax.ws.rs.GET)
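This endpoint lives under the AM web services root (/ws/v1/mapreduce), normally reached through the ResourceManager web proxy. A hedged client-side sketch of fetching the JSON it produces is below; the proxy host, application id, and job id are placeholders, not values from the example.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class JobCountersClientSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder address and ids -- substitute the real proxy host, app id and job id.
        String url = "http://rm-host:8088/proxy/application_1234567890123_0001"
                + "/ws/v1/mapreduce/jobs/job_1234567890123_0001/counters";
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestProperty("Accept", "application/json"); // matches the @Produces above
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // the JobCounterInfo bean serialized as jobCounters
            }
        } finally {
            conn.disconnect();
        }
    }
}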

Example 13 with Counters

Use of org.apache.hadoop.mapreduce.v2.api.records.Counters in project hadoop by apache.

The class AMWebServices, method getSingleTaskCounters.

@GET
@Path("/jobs/{jobid}/tasks/{taskid}/counters")
@Produces({ MediaType.APPLICATION_JSON + "; " + JettyUtils.UTF_8, MediaType.APPLICATION_XML + "; " + JettyUtils.UTF_8 })
public JobTaskCounterInfo getSingleTaskCounters(@Context HttpServletRequest hsr, @PathParam("jobid") String jid, @PathParam("taskid") String tid) {
    init();
    Job job = getJobFromJobIdString(jid, appCtx);
    checkAccess(job, hsr);
    Task task = getTaskFromTaskIdString(tid, job);
    return new JobTaskCounterInfo(task);
}
Also used : Task(org.apache.hadoop.mapreduce.v2.app.job.Task) JobTaskCounterInfo(org.apache.hadoop.mapreduce.v2.app.webapp.dao.JobTaskCounterInfo) Job(org.apache.hadoop.mapreduce.v2.app.job.Job) Path(javax.ws.rs.Path) Produces(javax.ws.rs.Produces) GET(javax.ws.rs.GET)
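The web service tests counted in the aggregations below exercise endpoints like this one with the Jersey 1.x client, so a sketch in that style is a natural usage example. Again, the host, application id, job id, and task id are placeholders, not values from the original code.

import javax.ws.rs.core.MediaType;

import com.sun.jersey.api.client.Client;
import com.sun.jersey.api.client.ClientResponse;
import com.sun.jersey.api.client.WebResource;

public class TaskCountersClientSketch {
    public static void main(String[] args) {
        WebResource r = Client.create()
                .resource("http://rm-host:8088/proxy/application_1234567890123_0001");
        ClientResponse response = r.path("ws").path("v1").path("mapreduce")
                .path("jobs").path("job_1234567890123_0001")
                .path("tasks").path("task_1234567890123_0001_m_000000")
                .path("counters")
                .accept(MediaType.APPLICATION_JSON)
                .get(ClientResponse.class);
        // The body is the JobTaskCounterInfo bean, serialized as a jobTaskCounters element.
        System.out.println(response.getEntity(String.class));
    }
}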

Example 14 with Counters

Use of org.apache.hadoop.mapreduce.v2.api.records.Counters in project hadoop by apache.

The class TaskAttemptListenerImpl, method reportDiagnosticInfo.

@Override
public void reportDiagnosticInfo(TaskAttemptID taskAttemptID, String diagnosticInfo) throws IOException {
    diagnosticInfo = StringInterner.weakIntern(diagnosticInfo);
    LOG.info("Diagnostics report from " + taskAttemptID.toString() + ": " + diagnosticInfo);
    org.apache.hadoop.mapreduce.v2.api.records.TaskAttemptId attemptID = TypeConverter.toYarn(taskAttemptID);
    taskHeartbeatHandler.progressing(attemptID);
    // This is mainly used for cases where we want to propagate exception traces
    // of tasks that fail.
    // This call exists as a hadoop mapreduce legacy wherein all changes in
    // counters/progress/phase/output-size are reported through statusUpdate()
    // call but not diagnosticInformation.
    context.getEventHandler().handle(new TaskAttemptDiagnosticsUpdateEvent(attemptID, diagnosticInfo));
}
Also used : TaskAttemptDiagnosticsUpdateEvent(org.apache.hadoop.mapreduce.v2.app.job.event.TaskAttemptDiagnosticsUpdateEvent)
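The StringInterner.weakIntern call at the top of the method is worth a note: identical diagnostic strings reported by many attempts collapse to one canonical, weakly referenced instance instead of piling up as duplicates in the AM heap. A small standalone sketch (illustrative only, not from TaskAttemptListenerImpl):

import org.apache.hadoop.util.StringInterner;

public class DiagnosticInternSketch {
    public static void main(String[] args) {
        // new String(...) forces two distinct instances before interning.
        String a = StringInterner.weakIntern(new String("Error: java.io.IOException: disk full"));
        String b = StringInterner.weakIntern(new String("Error: java.io.IOException: disk full"));
        // Both calls return the same canonical instance; it remains collectable once unreferenced.
        System.out.println(a == b); // true
    }
}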

Example 15 with Counters

Use of org.apache.hadoop.mapreduce.v2.api.records.Counters in project hadoop by apache.

The class TestFetchFailure, method updateStatus.

private void updateStatus(MRApp app, TaskAttempt attempt, Phase phase) {
    TaskAttemptStatusUpdateEvent.TaskAttemptStatus status = new TaskAttemptStatusUpdateEvent.TaskAttemptStatus();
    status.counters = new Counters();
    status.fetchFailedMaps = new ArrayList<TaskAttemptId>();
    status.id = attempt.getID();
    status.mapFinishTime = 0;
    status.phase = phase;
    status.progress = 0.5f;
    status.shuffleFinishTime = 0;
    status.sortFinishTime = 0;
    status.stateString = "OK";
    status.taskState = attempt.getState();
    TaskAttemptStatusUpdateEvent event = new TaskAttemptStatusUpdateEvent(attempt.getID(), status);
    app.getContext().getEventHandler().handle(event);
}
Also used : TaskAttemptStatusUpdateEvent(org.apache.hadoop.mapreduce.v2.app.job.event.TaskAttemptStatusUpdateEvent) TaskAttemptId(org.apache.hadoop.mapreduce.v2.api.records.TaskAttemptId) Counters(org.apache.hadoop.mapreduce.Counters)
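For context, TestFetchFailure uses this helper to simulate reduce-side status reports, and the fetchFailedMaps list it initializes is where a reduce attempt would name the map outputs it could not fetch. The following is a hedged, standalone sketch (not part of the test) of populating that field; the ids are built with MRBuilderUtils purely for illustration, and the event wiring is only indicated in a comment.

import java.util.ArrayList;

import org.apache.hadoop.mapreduce.Counters;
import org.apache.hadoop.mapreduce.v2.api.records.JobId;
import org.apache.hadoop.mapreduce.v2.api.records.TaskAttemptId;
import org.apache.hadoop.mapreduce.v2.api.records.TaskId;
import org.apache.hadoop.mapreduce.v2.api.records.TaskType;
import org.apache.hadoop.mapreduce.v2.app.job.event.TaskAttemptStatusUpdateEvent;
import org.apache.hadoop.mapreduce.v2.util.MRBuilderUtils;
import org.apache.hadoop.yarn.api.records.ApplicationId;

public class FetchFailureStatusSketch {
    public static void main(String[] args) {
        // Illustrative ids only: a map attempt whose output the reduce failed to fetch.
        JobId jobId = MRBuilderUtils.newJobId(ApplicationId.newInstance(1234L, 1), 1);
        TaskId mapTask = MRBuilderUtils.newTaskId(jobId, 0, TaskType.MAP);
        TaskAttemptId failedMap = MRBuilderUtils.newTaskAttemptId(mapTask, 0);

        TaskAttemptStatusUpdateEvent.TaskAttemptStatus status =
                new TaskAttemptStatusUpdateEvent.TaskAttemptStatus();
        status.counters = new Counters();
        status.fetchFailedMaps = new ArrayList<TaskAttemptId>();
        status.fetchFailedMaps.add(failedMap);
        // In the test, this status would be wrapped in a TaskAttemptStatusUpdateEvent for the
        // reduce attempt and handed to the app's event handler, exactly as updateStatus does.
        System.out.println("Reporting fetch failure for " + failedMap);
    }
}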

Aggregations

Job (org.apache.hadoop.mapreduce.v2.app.job.Job): 36
Test (org.junit.Test): 34
JobId (org.apache.hadoop.mapreduce.v2.api.records.JobId): 29
Task (org.apache.hadoop.mapreduce.v2.app.job.Task): 28
ClientResponse (com.sun.jersey.api.client.ClientResponse): 21
WebResource (com.sun.jersey.api.client.WebResource): 21
Counters (org.apache.hadoop.mapreduce.Counters): 18
TaskAttempt (org.apache.hadoop.mapreduce.v2.app.job.TaskAttempt): 16
TaskAttemptId (org.apache.hadoop.mapreduce.v2.api.records.TaskAttemptId): 15
JSONObject (org.codehaus.jettison.json.JSONObject): 15
TaskId (org.apache.hadoop.mapreduce.v2.api.records.TaskId): 13
Configuration (org.apache.hadoop.conf.Configuration): 9
Counters (org.apache.hadoop.mapreduce.v2.api.records.Counters): 8
StringReader (java.io.StringReader): 6
GET (javax.ws.rs.GET): 6
Path (javax.ws.rs.Path): 6
Produces (javax.ws.rs.Produces): 6
DocumentBuilder (javax.xml.parsers.DocumentBuilder): 6
DocumentBuilderFactory (javax.xml.parsers.DocumentBuilderFactory): 6
CounterGroup (org.apache.hadoop.mapreduce.v2.api.records.CounterGroup): 6