
Example 6 with CompletedJob

Use of org.apache.hadoop.mapreduce.v2.hs.CompletedJob in project hadoop by apache.

From the class TestHistoryFileManager, method testHistoryFileInfoShouldReturnCompletedJobIfMaxNotConfiged:

@Test
public void testHistoryFileInfoShouldReturnCompletedJobIfMaxNotConfiged() throws Exception {
    HistoryFileManagerTest hmTest = new HistoryFileManagerTest();
    Configuration conf = dfsCluster.getConfiguration(0);
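    // A negative value leaves MR_HS_LOADED_JOBS_TASKS_MAX unconfigured, so the job
    // should still be fully loaded as a CompletedJob.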
    conf.setInt(JHAdminConfig.MR_HS_LOADED_JOBS_TASKS_MAX, -1);
    hmTest.init(conf);
    final String jobId = "job_1416424547277_0002";
    JobIndexInfo jobIndexInfo = new JobIndexInfo();
    jobIndexInfo.setJobId(TypeConverter.toYarn(JobID.forName(jobId)));
    jobIndexInfo.setNumMaps(100);
    jobIndexInfo.setNumReduces(100);
    final String historyFile = getClass().getClassLoader().getResource("job_2.0.3-alpha-FAILED.jhist").getFile();
    final Path historyFilePath = FileSystem.getLocal(conf).makeQualified(new Path(historyFile));
    HistoryFileInfo info = hmTest.getHistoryFileInfo(historyFilePath, null, null, jobIndexInfo, false);
    Job job = info.loadJob();
    Assert.assertTrue("Should return an instance of CompletedJob as " + "a result of parsing the job history file of the job", job instanceof CompletedJob);
}
Also used : Path(org.apache.hadoop.fs.Path) HistoryFileInfo(org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.HistoryFileInfo) YarnConfiguration(org.apache.hadoop.yarn.conf.YarnConfiguration) Configuration(org.apache.hadoop.conf.Configuration) HdfsConfiguration(org.apache.hadoop.hdfs.HdfsConfiguration) Job(org.apache.hadoop.mapreduce.v2.app.job.Job) JobIndexInfo(org.apache.hadoop.mapreduce.v2.jobhistory.JobIndexInfo) Test(org.junit.Test)

Example 7 with CompletedJob

Use of org.apache.hadoop.mapreduce.v2.hs.CompletedJob in project hadoop by apache.

From the class TestHistoryFileManager, method testHistoryFileInfoLoadNormalSizedJobShouldReturnCompletedJob:

@Test
public void testHistoryFileInfoLoadNormalSizedJobShouldReturnCompletedJob() throws Exception {
    HistoryFileManagerTest hmTest = new HistoryFileManagerTest();
    final int numOfTasks = 100;
    Configuration conf = dfsCluster.getConfiguration(0);
    conf.setInt(JHAdminConfig.MR_HS_LOADED_JOBS_TASKS_MAX, numOfTasks + numOfTasks + 1);
    hmTest.init(conf);
    // set up a job of which the number of tasks is smaller than the maximum
    // allowed, and therefore will be fully loaded.
    final String jobId = "job_1416424547277_0002";
    JobIndexInfo jobIndexInfo = new JobIndexInfo();
    jobIndexInfo.setJobId(TypeConverter.toYarn(JobID.forName(jobId)));
    jobIndexInfo.setNumMaps(numOfTasks);
    jobIndexInfo.setNumReduces(numOfTasks);
    final String historyFile = getClass().getClassLoader().getResource("job_2.0.3-alpha-FAILED.jhist").getFile();
    final Path historyFilePath = FileSystem.getLocal(conf).makeQualified(new Path(historyFile));
    HistoryFileInfo info = hmTest.getHistoryFileInfo(historyFilePath, null, null, jobIndexInfo, false);
    Job job = info.loadJob();
    Assert.assertTrue("Should return an instance of CompletedJob as " + "a result of parsing the job history file of the job", job instanceof CompletedJob);
}
Also used : Path(org.apache.hadoop.fs.Path) HistoryFileInfo(org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.HistoryFileInfo) YarnConfiguration(org.apache.hadoop.yarn.conf.YarnConfiguration) Configuration(org.apache.hadoop.conf.Configuration) HdfsConfiguration(org.apache.hadoop.hdfs.HdfsConfiguration) Job(org.apache.hadoop.mapreduce.v2.app.job.Job) JobIndexInfo(org.apache.hadoop.mapreduce.v2.jobhistory.JobIndexInfo) Test(org.junit.Test)
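
Examples 6 and 7 exercise the same knob, JHAdminConfig.MR_HS_LOADED_JOBS_TASKS_MAX: a negative value means the limit is not configured, and a job whose total task count stays within a configured limit is still parsed into a full CompletedJob. The helper below is a hypothetical sketch that only restates the rule these two assertions check; it is not the actual HistoryFileManager code, and the exact comparison used there may differ.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.v2.jobhistory.JHAdminConfig;
import org.apache.hadoop.mapreduce.v2.jobhistory.JobIndexInfo;

// Hypothetical helper, not part of Hadoop: summarizes the behaviour the two tests assert.
class LoadedJobPolicySketch {

    static boolean shouldFullyLoad(Configuration conf, JobIndexInfo indexInfo) {
        int maxTasks = conf.getInt(JHAdminConfig.MR_HS_LOADED_JOBS_TASKS_MAX, -1);
        // A negative limit means "not configured": always load the full CompletedJob.
        if (maxTasks < 0) {
            return true;
        }
        // Otherwise compare the job's total task count against the limit, as Example 7
        // does with 100 maps + 100 reduces against a limit of 201.
        int totalTasks = indexInfo.getNumMaps() + indexInfo.getNumReduces();
        return totalTasks <= maxTasks;
    }
}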

Example 8 with CompletedJob

Use of org.apache.hadoop.mapreduce.v2.hs.CompletedJob in project hadoop by apache.

From the class TestJobHistoryEntities, method testCompletedJob:

/* Verify some expected values based on the history file */
@Test(timeout = 100000)
public void testCompletedJob() throws Exception {
    HistoryFileInfo info = mock(HistoryFileInfo.class);
    when(info.getConfFile()).thenReturn(fullConfPath);
    when(info.getHistoryFile()).thenReturn(fullHistoryPath);
    //Re-initialize to verify the delayed load.
    completedJob = new CompletedJob(conf, jobId, fullHistoryPath, loadTasks, "user", info, jobAclsManager);
    //Verify tasks loaded based on loadTask parameter.
    assertEquals(loadTasks, completedJob.tasksLoaded.get());
    assertEquals(1, completedJob.getAMInfos().size());
    assertEquals(10, completedJob.getCompletedMaps());
    assertEquals(1, completedJob.getCompletedReduces());
    assertEquals(12, completedJob.getTasks().size());
    //Verify tasks loaded at this point.
    assertEquals(true, completedJob.tasksLoaded.get());
    assertEquals(10, completedJob.getTasks(TaskType.MAP).size());
    assertEquals(2, completedJob.getTasks(TaskType.REDUCE).size());
    assertEquals("user", completedJob.getUserName());
    assertEquals(JobState.SUCCEEDED, completedJob.getState());
    JobReport jobReport = completedJob.getReport();
    assertEquals("user", jobReport.getUser());
    assertEquals(JobState.SUCCEEDED, jobReport.getJobState());
    assertEquals(fullHistoryPath.toString(), jobReport.getHistoryFile());
}
Also used : HistoryFileInfo(org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.HistoryFileInfo) JobReport(org.apache.hadoop.mapreduce.v2.api.records.JobReport) Test(org.junit.Test)
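
The delayed-load contract asserted above can be restated briefly: with loadTasks set to false, CompletedJob parses only job-level information at construction time, and the first task-level query forces the full history parse. The sketch below illustrates that under the same assumptions as the test; note that tasksLoaded is accessed the way the test accesses it, which only works from the org.apache.hadoop.mapreduce.v2.hs package.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobACLsManager;
import org.apache.hadoop.mapreduce.v2.api.records.JobId;
import org.apache.hadoop.mapreduce.v2.api.records.TaskType;
import org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.HistoryFileInfo;

// Illustrative sketch only; the arguments are assumed to come from a fixture like the
// one in the test (conf, jobId, fullHistoryPath, mocked info, jobAclsManager).
static void demonstrateDelayedLoad(Configuration conf, JobId jobId, Path historyPath,
        HistoryFileInfo info, JobACLsManager aclsManager) throws IOException {
    CompletedJob job = new CompletedJob(conf, jobId, historyPath,
            false /* loadTasks */, "user", info, aclsManager);
    // With loadTasks == false, nothing beyond the job summary has been parsed yet.
    assert !job.tasksLoaded.get();
    // The first task-level call triggers the full parse of the .jhist file.
    job.getTasks(TaskType.MAP);
    assert job.tasksLoaded.get();
}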

Example 9 with CompletedJob

Use of org.apache.hadoop.mapreduce.v2.hs.CompletedJob in project hadoop by apache.

From the class TestJobInfo, method testAverageMergeTime:

@Test(timeout = 10000)
public void testAverageMergeTime() throws IOException {
    String historyFileName = "job_1329348432655_0001-1329348443227-user-Sleep+job-1329348468601-10-1-SUCCEEDED-default.jhist";
    String confFileName = "job_1329348432655_0001_conf.xml";
    Configuration conf = new Configuration();
    JobACLsManager jobAclsMgr = new JobACLsManager(conf);
    Path fullHistoryPath = new Path(TestJobHistoryEntities.class.getClassLoader().getResource(historyFileName).getFile());
    Path fullConfPath = new Path(TestJobHistoryEntities.class.getClassLoader().getResource(confFileName).getFile());
    HistoryFileInfo info = mock(HistoryFileInfo.class);
    when(info.getConfFile()).thenReturn(fullConfPath);
    when(info.getHistoryFile()).thenReturn(fullHistoryPath);
    JobId jobId = MRBuilderUtils.newJobId(1329348432655L, 1, 1);
    CompletedJob completedJob = new CompletedJob(conf, jobId, fullHistoryPath, true, "user", info, jobAclsMgr);
    JobInfo jobInfo = new JobInfo(completedJob);
    // There are 2 tasks with merge time of 45 and 55 respectively. So average
    // merge time should be 50.
    Assert.assertEquals(50L, jobInfo.getAvgMergeTime().longValue());
}
Also used : Path(org.apache.hadoop.fs.Path) TestJobHistoryEntities(org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities) HistoryFileInfo(org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.HistoryFileInfo) CompletedJob(org.apache.hadoop.mapreduce.v2.hs.CompletedJob) Configuration(org.apache.hadoop.conf.Configuration) JobACLsManager(org.apache.hadoop.mapred.JobACLsManager) JobId(org.apache.hadoop.mapreduce.v2.api.records.JobId) Test(org.junit.Test)
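
The expected value of 50 comes from the two reduce attempts recorded in that history file, whose merge times are 45 and 55. The sketch below shows the averaging behind the assertion; treating each attempt's merge time as sortFinishTime minus shuffleFinishTime is an assumption about how JobInfo derives it, not a quote of its code.

// Hypothetical sketch of the averaging the assertion relies on.
// Per-attempt merge time is assumed here to be sortFinishTime - shuffleFinishTime.
static long averageMergeTime(long[] shuffleFinishTimes, long[] sortFinishTimes) {
    long total = 0;
    for (int i = 0; i < sortFinishTimes.length; i++) {
        total += sortFinishTimes[i] - shuffleFinishTimes[i];
    }
    return total / sortFinishTimes.length;
}
// With merge times of 45 and 55, (45 + 55) / 2 = 50, matching the assertion above.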

Aggregations

HistoryFileInfo (org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.HistoryFileInfo): 9 usages
Test (org.junit.Test): 9 usages
Path (org.apache.hadoop.fs.Path): 4 usages
Configuration (org.apache.hadoop.conf.Configuration): 3 usages
HdfsConfiguration (org.apache.hadoop.hdfs.HdfsConfiguration): 2 usages
JobReport (org.apache.hadoop.mapreduce.v2.api.records.JobReport): 2 usages
TaskId (org.apache.hadoop.mapreduce.v2.api.records.TaskId): 2 usages
Job (org.apache.hadoop.mapreduce.v2.app.job.Job): 2 usages
Task (org.apache.hadoop.mapreduce.v2.app.job.Task): 2 usages
JobIndexInfo (org.apache.hadoop.mapreduce.v2.jobhistory.JobIndexInfo): 2 usages
YarnConfiguration (org.apache.hadoop.yarn.conf.YarnConfiguration): 2 usages
JobACLsManager (org.apache.hadoop.mapred.JobACLsManager): 1 usage
TaskCompletionEvent (org.apache.hadoop.mapred.TaskCompletionEvent): 1 usage
JobHistoryParser (org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser): 1 usage
JobInfo (org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.JobInfo): 1 usage
JobId (org.apache.hadoop.mapreduce.v2.api.records.JobId): 1 usage
TaskAttemptId (org.apache.hadoop.mapreduce.v2.api.records.TaskAttemptId): 1 usage
TaskAttemptReport (org.apache.hadoop.mapreduce.v2.api.records.TaskAttemptReport): 1 usage
TaskReport (org.apache.hadoop.mapreduce.v2.api.records.TaskReport): 1 usage
TaskAttempt (org.apache.hadoop.mapreduce.v2.app.job.TaskAttempt): 1 usage