
Example 6 with BlockForTest

Use of org.apache.hadoop.yarn.webapp.view.BlockForTest in project hadoop by apache.

From class TestAggregatedLogsBlock, method testAggregatedLogsBlockHar:

/**
   * Reading logs from a HAR archive should succeed, and they should be
   * shown in the AggregatedLogsBlock HTML.
   *
   * @throws Exception
   */
@Test
public void testAggregatedLogsBlockHar() throws Exception {
    FileUtil.fullyDelete(new File("target/logs"));
    Configuration configuration = getConfiguration();
    URL harUrl = ClassLoader.getSystemClassLoader().getResource("application_1440536969523_0001.har");
    assertNotNull(harUrl);
    String path = "target/logs/admin/logs/application_1440536969523_0001" + "/application_1440536969523_0001.har";
    FileUtils.copyDirectory(new File(harUrl.getPath()), new File(path));
    AggregatedLogsBlockForTest aggregatedBlock = getAggregatedLogsBlockForTest(configuration, "admin", "container_1440536969523_0001_01_000001", "host1:1111");
    ByteArrayOutputStream data = new ByteArrayOutputStream();
    PrintWriter printWriter = new PrintWriter(data);
    HtmlBlock html = new HtmlBlockForTest();
    HtmlBlock.Block block = new BlockForTest(html, printWriter, 10, false);
    aggregatedBlock.render(block);
    block.getWriter().flush();
    String out = data.toString();
    assertTrue(out.contains("Hello stderr"));
    assertTrue(out.contains("Hello stdout"));
    assertTrue(out.contains("Hello syslog"));
    aggregatedBlock = getAggregatedLogsBlockForTest(configuration, "admin", "container_1440536969523_0001_01_000002", "host2:2222");
    data = new ByteArrayOutputStream();
    printWriter = new PrintWriter(data);
    html = new HtmlBlockForTest();
    block = new BlockForTest(html, printWriter, 10, false);
    aggregatedBlock.render(block);
    block.getWriter().flush();
    out = data.toString();
    assertTrue(out.contains("Goodbye stderr"));
    assertTrue(out.contains("Goodbye stdout"));
    assertTrue(out.contains("Goodbye syslog"));
}
Also used : HtmlBlockForTest(org.apache.hadoop.yarn.webapp.view.HtmlBlockForTest) YarnConfiguration(org.apache.hadoop.yarn.conf.YarnConfiguration) Configuration(org.apache.hadoop.conf.Configuration) AggregatedLogsBlockForTest(org.apache.hadoop.yarn.webapp.log.AggregatedLogsBlockForTest) ByteArrayOutputStream(java.io.ByteArrayOutputStream) HtmlBlock(org.apache.hadoop.yarn.webapp.view.HtmlBlock) File(java.io.File) URL(java.net.URL) PrintWriter(java.io.PrintWriter) BlockForTest(org.apache.hadoop.yarn.webapp.view.BlockForTest) Test(org.junit.Test)
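All of these tests follow the same capture pattern: render into a Block backed by a PrintWriter over a ByteArrayOutputStream, flush the writer, and assert on the resulting string. A minimal, Hadoop-free sketch of that pattern using only the JDK (the RenderCapture name and capture helper are hypothetical, introduced here for illustration):

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintWriter;
import java.nio.charset.StandardCharsets;
import java.util.function.Consumer;

public class RenderCapture {
    // Runs the given renderer against a fresh writer and returns what it wrote.
    static String capture(Consumer<PrintWriter> renderer) {
        ByteArrayOutputStream data = new ByteArrayOutputStream();
        PrintWriter writer = new PrintWriter(data);
        renderer.accept(writer);
        // Flush before reading the buffer, exactly as the tests above do.
        writer.flush();
        return new String(data.toByteArray(), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String out = capture(w -> w.println("Hello stdout"));
        System.out.println(out.contains("Hello stdout"));
    }
}
```

In the real tests, `renderer.accept(writer)` corresponds to `aggregatedBlock.render(block)` with the writer wired in through BlockForTest; the sketch only isolates the buffer-and-flush mechanics.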

Example 7 with BlockForTest

Use of org.apache.hadoop.yarn.webapp.view.BlockForTest in project hadoop by apache.

From class TestAggregatedLogsBlock, method testAccessDenied:

/**
   * Bad user: user 'owner' tries to read logs without access.
   */
@Test
public void testAccessDenied() throws Exception {
    FileUtil.fullyDelete(new File("target/logs"));
    Configuration configuration = getConfiguration();
    writeLogs("target/logs/logs/application_0_0001/container_0_0001_01_000001");
    writeLog(configuration, "owner");
    AggregatedLogsBlockForTest aggregatedBlock = getAggregatedLogsBlockForTest(configuration, "owner", "container_0_0001_01_000001");
    ByteArrayOutputStream data = new ByteArrayOutputStream();
    PrintWriter printWriter = new PrintWriter(data);
    HtmlBlock html = new HtmlBlockForTest();
    HtmlBlock.Block block = new BlockForTest(html, printWriter, 10, false);
    aggregatedBlock.render(block);
    block.getWriter().flush();
    String out = data.toString();
    assertTrue(out.contains("User [owner] is not authorized to view the logs for entity"));
}
Also used : HtmlBlockForTest(org.apache.hadoop.yarn.webapp.view.HtmlBlockForTest) YarnConfiguration(org.apache.hadoop.yarn.conf.YarnConfiguration) Configuration(org.apache.hadoop.conf.Configuration) AggregatedLogsBlockForTest(org.apache.hadoop.yarn.webapp.log.AggregatedLogsBlockForTest) ByteArrayOutputStream(java.io.ByteArrayOutputStream) HtmlBlock(org.apache.hadoop.yarn.webapp.view.HtmlBlock) File(java.io.File) PrintWriter(java.io.PrintWriter) BlockForTest(org.apache.hadoop.yarn.webapp.view.BlockForTest) Test(org.junit.Test)

Example 8 with BlockForTest

Use of org.apache.hadoop.yarn.webapp.view.BlockForTest in project hadoop by apache.

From class TestBlocks, method testHsJobsBlock:

/**
   * Test HsJobsBlock's rendering.
   */
@Test
public void testHsJobsBlock() {
    AppContext ctx = mock(AppContext.class);
    Map<JobId, Job> jobs = new HashMap<JobId, Job>();
    Job job = getJob();
    jobs.put(job.getID(), job);
    when(ctx.getAllJobs()).thenReturn(jobs);
    HsJobsBlock block = new HsJobsBlockForTest(ctx);
    PrintWriter pWriter = new PrintWriter(data);
    Block html = new BlockForTest(new HtmlBlockForTest(), pWriter, 0, false);
    block.render(html);
    pWriter.flush();
    assertTrue(data.toString().contains("JobName"));
    assertTrue(data.toString().contains("UserName"));
    assertTrue(data.toString().contains("QueueName"));
    assertTrue(data.toString().contains("SUCCEEDED"));
}
Also used : HashMap(java.util.HashMap) AppContext(org.apache.hadoop.mapreduce.v2.app.AppContext) HtmlBlock(org.apache.hadoop.yarn.webapp.view.HtmlBlock) Block(org.apache.hadoop.yarn.webapp.view.HtmlBlock.Block) AttemptsBlock(org.apache.hadoop.mapreduce.v2.hs.webapp.HsTaskPage.AttemptsBlock) Job(org.apache.hadoop.mapreduce.v2.app.job.Job) JobId(org.apache.hadoop.mapreduce.v2.api.records.JobId) PrintWriter(java.io.PrintWriter) BlockForTest(org.apache.hadoop.yarn.webapp.view.BlockForTest) Test(org.junit.Test) AppForTest(org.apache.hadoop.mapreduce.v2.app.webapp.AppForTest)

Example 9 with BlockForTest

Use of org.apache.hadoop.yarn.webapp.view.BlockForTest in project hadoop by apache.

From class TestBlocks, method testAttemptsBlock:

/**
   * Test AttemptsBlock's rendering.
   */
@Test
public void testAttemptsBlock() {
    AppContext ctx = mock(AppContext.class);
    AppForTest app = new AppForTest(ctx);
    Task task = getTask(0);
    Map<TaskAttemptId, TaskAttempt> attempts = new HashMap<TaskAttemptId, TaskAttempt>();
    TaskAttempt attempt = mock(TaskAttempt.class);
    TaskAttemptId taId = new TaskAttemptIdPBImpl();
    taId.setId(0);
    taId.setTaskId(task.getID());
    when(attempt.getID()).thenReturn(taId);
    when(attempt.getNodeHttpAddress()).thenReturn("Node address");
    ApplicationId appId = ApplicationIdPBImpl.newInstance(0, 5);
    ApplicationAttemptId appAttemptId = ApplicationAttemptIdPBImpl.newInstance(appId, 1);
    ContainerId containerId = ContainerIdPBImpl.newContainerId(appAttemptId, 1);
    when(attempt.getAssignedContainerID()).thenReturn(containerId);
    when(attempt.getAssignedContainerMgrAddress()).thenReturn("assignedContainerMgrAddress");
    when(attempt.getNodeRackName()).thenReturn("nodeRackName");
    final long taStartTime = 100002L;
    final long taFinishTime = 100012L;
    final long taShuffleFinishTime = 100010L;
    final long taSortFinishTime = 100011L;
    final TaskAttemptState taState = TaskAttemptState.SUCCEEDED;
    when(attempt.getLaunchTime()).thenReturn(taStartTime);
    when(attempt.getFinishTime()).thenReturn(taFinishTime);
    when(attempt.getShuffleFinishTime()).thenReturn(taShuffleFinishTime);
    when(attempt.getSortFinishTime()).thenReturn(taSortFinishTime);
    when(attempt.getState()).thenReturn(taState);
    TaskAttemptReport taReport = mock(TaskAttemptReport.class);
    when(taReport.getStartTime()).thenReturn(taStartTime);
    when(taReport.getFinishTime()).thenReturn(taFinishTime);
    when(taReport.getShuffleFinishTime()).thenReturn(taShuffleFinishTime);
    when(taReport.getSortFinishTime()).thenReturn(taSortFinishTime);
    when(taReport.getContainerId()).thenReturn(containerId);
    when(taReport.getProgress()).thenReturn(1.0f);
    when(taReport.getStateString()).thenReturn("Processed 128/128 records <p> \n");
    when(taReport.getTaskAttemptState()).thenReturn(taState);
    when(taReport.getDiagnosticInfo()).thenReturn("");
    when(attempt.getReport()).thenReturn(taReport);
    attempts.put(taId, attempt);
    when(task.getAttempts()).thenReturn(attempts);
    app.setTask(task);
    Job job = mock(Job.class);
    when(job.getUserName()).thenReturn("User");
    app.setJob(job);
    AttemptsBlockForTest block = new AttemptsBlockForTest(app);
    block.addParameter(AMParams.TASK_TYPE, "r");
    PrintWriter pWriter = new PrintWriter(data);
    Block html = new BlockForTest(new HtmlBlockForTest(), pWriter, 0, false);
    block.render(html);
    pWriter.flush();
    // information about the attempts should be printed
    assertTrue(data.toString().contains("attempt_0_0001_r_000000_0"));
    assertTrue(data.toString().contains("SUCCEEDED"));
    assertFalse(data.toString().contains("Processed 128/128 records <p> \n"));
    assertTrue(data.toString().contains("Processed 128\\/128 records &lt;p&gt; \\n"));
    assertTrue(data.toString().contains("_0005_01_000001:attempt_0_0001_r_000000_0:User:"));
    assertTrue(data.toString().contains("100002"));
    assertTrue(data.toString().contains("100010"));
    assertTrue(data.toString().contains("100011"));
    assertTrue(data.toString().contains("100012"));
}
Also used : Task(org.apache.hadoop.mapreduce.v2.app.job.Task) TaskAttemptIdPBImpl(org.apache.hadoop.mapreduce.v2.api.records.impl.pb.TaskAttemptIdPBImpl) HashMap(java.util.HashMap) TaskAttemptId(org.apache.hadoop.mapreduce.v2.api.records.TaskAttemptId) AppContext(org.apache.hadoop.mapreduce.v2.app.AppContext) ApplicationAttemptId(org.apache.hadoop.yarn.api.records.ApplicationAttemptId) BlockForTest(org.apache.hadoop.yarn.webapp.view.BlockForTest) TaskAttemptReport(org.apache.hadoop.mapreduce.v2.api.records.TaskAttemptReport) ContainerId(org.apache.hadoop.yarn.api.records.ContainerId) TaskAttemptState(org.apache.hadoop.mapreduce.v2.api.records.TaskAttemptState) HtmlBlock(org.apache.hadoop.yarn.webapp.view.HtmlBlock) Block(org.apache.hadoop.yarn.webapp.view.HtmlBlock.Block) AttemptsBlock(org.apache.hadoop.mapreduce.v2.hs.webapp.HsTaskPage.AttemptsBlock) TaskAttempt(org.apache.hadoop.mapreduce.v2.app.job.TaskAttempt) AppForTest(org.apache.hadoop.mapreduce.v2.app.webapp.AppForTest) ApplicationId(org.apache.hadoop.yarn.api.records.ApplicationId) Job(org.apache.hadoop.mapreduce.v2.app.job.Job) PrintWriter(java.io.PrintWriter) Test(org.junit.Test)
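The assertion pair on the state string is the interesting part of testAttemptsBlock: the raw "Processed 128/128 records &lt;p&gt; \n" must not appear in the output, while its escaped form with "\/" for the slash, HTML entities for the angle brackets, and a literal "\n" must. A hedged sketch of an escaper producing that behavior (the JsHtmlEscape name is hypothetical; this is not Hadoop's actual escaping code, only a JDK-only illustration of the transformation the assertions expect):

```java
public class JsHtmlEscape {
    // Escapes a state string the way the assertions above expect:
    // '/' -> "\/", '<' -> "&lt;", '>' -> "&gt;", newline -> literal "\n".
    static String escape(String s) {
        StringBuilder sb = new StringBuilder();
        for (char c : s.toCharArray()) {
            switch (c) {
                case '/':  sb.append("\\/");  break;
                case '<':  sb.append("&lt;"); break;
                case '>':  sb.append("&gt;"); break;
                case '\n': sb.append("\\n");  break;
                default:   sb.append(c);
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(escape("Processed 128/128 records <p> \n"));
    }
}
```

Escaping like this matters because the rendered value is embedded in the page's JavaScript data, where an unescaped "/" or "<p>" could terminate a string or inject markup.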

Example 10 with BlockForTest

Use of org.apache.hadoop.yarn.webapp.view.BlockForTest in project hadoop by apache.

From class TestBlocks, method testSingleCounterBlock:

@Test
public void testSingleCounterBlock() {
    AppContext appCtx = mock(AppContext.class);
    View.ViewContext ctx = mock(View.ViewContext.class);
    JobId jobId = new JobIdPBImpl();
    jobId.setId(0);
    jobId.setAppId(ApplicationIdPBImpl.newInstance(0, 1));
    TaskId mapTaskId = new TaskIdPBImpl();
    mapTaskId.setId(0);
    mapTaskId.setTaskType(TaskType.MAP);
    mapTaskId.setJobId(jobId);
    Task mapTask = mock(Task.class);
    when(mapTask.getID()).thenReturn(mapTaskId);
    TaskReport mapReport = mock(TaskReport.class);
    when(mapTask.getReport()).thenReturn(mapReport);
    when(mapTask.getType()).thenReturn(TaskType.MAP);
    TaskId reduceTaskId = new TaskIdPBImpl();
    reduceTaskId.setId(0);
    reduceTaskId.setTaskType(TaskType.REDUCE);
    reduceTaskId.setJobId(jobId);
    Task reduceTask = mock(Task.class);
    when(reduceTask.getID()).thenReturn(reduceTaskId);
    TaskReport reduceReport = mock(TaskReport.class);
    when(reduceTask.getReport()).thenReturn(reduceReport);
    when(reduceTask.getType()).thenReturn(TaskType.REDUCE);
    Map<TaskId, Task> tasks = new HashMap<TaskId, Task>();
    tasks.put(mapTaskId, mapTask);
    tasks.put(reduceTaskId, reduceTask);
    Job job = mock(Job.class);
    when(job.getTasks()).thenReturn(tasks);
    when(appCtx.getJob(any(JobId.class))).thenReturn(job);
    // SingleCounter for map task
    SingleCounterBlockForMapTest blockForMapTest = spy(new SingleCounterBlockForMapTest(appCtx, ctx));
    PrintWriter pWriterForMapTest = new PrintWriter(data);
    Block htmlForMapTest = new BlockForTest(new HtmlBlockForTest(), pWriterForMapTest, 0, false);
    blockForMapTest.render(htmlForMapTest);
    pWriterForMapTest.flush();
    assertTrue(data.toString().contains("task_0_0001_m_000000"));
    assertFalse(data.toString().contains("task_0_0001_r_000000"));
    data.reset();
    // SingleCounter for reduce task
    SingleCounterBlockForReduceTest blockForReduceTest = spy(new SingleCounterBlockForReduceTest(appCtx, ctx));
    PrintWriter pWriterForReduceTest = new PrintWriter(data);
    Block htmlForReduceTest = new BlockForTest(new HtmlBlockForTest(), pWriterForReduceTest, 0, false);
    blockForReduceTest.render(htmlForReduceTest);
    pWriterForReduceTest.flush();
    System.out.println(data.toString());
    assertFalse(data.toString().contains("task_0_0001_m_000000"));
    assertTrue(data.toString().contains("task_0_0001_r_000000"));
}
Also used : Task(org.apache.hadoop.mapreduce.v2.app.job.Task) TaskId(org.apache.hadoop.mapreduce.v2.api.records.TaskId) TaskReport(org.apache.hadoop.mapreduce.v2.api.records.TaskReport) TaskIdPBImpl(org.apache.hadoop.mapreduce.v2.api.records.impl.pb.TaskIdPBImpl) HashMap(java.util.HashMap) AppContext(org.apache.hadoop.mapreduce.v2.app.AppContext) View(org.apache.hadoop.yarn.webapp.View) BlockForTest(org.apache.hadoop.yarn.webapp.view.BlockForTest) JobIdPBImpl(org.apache.hadoop.mapreduce.v2.api.records.impl.pb.JobIdPBImpl) HtmlBlock(org.apache.hadoop.yarn.webapp.view.HtmlBlock) Block(org.apache.hadoop.yarn.webapp.view.HtmlBlock.Block) FewAttemptsBlock(org.apache.hadoop.mapreduce.v2.app.webapp.AttemptsPage.FewAttemptsBlock) Job(org.apache.hadoop.mapreduce.v2.app.job.Job) JobId(org.apache.hadoop.mapreduce.v2.api.records.JobId) PrintWriter(java.io.PrintWriter) Test(org.junit.Test)
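Unlike the earlier tests, testSingleCounterBlock reuses a single buffer for both renders and calls data.reset() between them, so the reduce-side assertions cannot be satisfied by leftover map-side output. A JDK-only sketch of that reset-between-renders discipline (the ResetBetweenRenders name and renderBoth helper are hypothetical, introduced for illustration):

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintWriter;

public class ResetBetweenRenders {
    // Writes two renders into the same buffer, resetting in between,
    // and returns what the buffer holds after the second render.
    static String renderBoth(String firstRender, String secondRender) {
        ByteArrayOutputStream data = new ByteArrayOutputStream();
        PrintWriter writer = new PrintWriter(data);
        writer.print(firstRender);
        writer.flush();
        // In the real test, assertions on the first render go here.
        data.reset(); // discard the first render's bytes
        writer.print(secondRender);
        writer.flush();
        return data.toString();
    }

    public static void main(String[] args) {
        System.out.println(renderBoth("task_0_0001_m_000000", "task_0_0001_r_000000"));
    }
}
```

Without the reset() call, the second `assertFalse(data.toString().contains("task_0_0001_m_000000"))` check would fail, since the map task id from the first render would still be in the buffer.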

Aggregations

PrintWriter (java.io.PrintWriter): 14 uses
BlockForTest (org.apache.hadoop.yarn.webapp.view.BlockForTest): 14 uses
HtmlBlock (org.apache.hadoop.yarn.webapp.view.HtmlBlock): 14 uses
Test (org.junit.Test): 13 uses
Configuration (org.apache.hadoop.conf.Configuration): 8 uses
AppContext (org.apache.hadoop.mapreduce.v2.app.AppContext): 7 uses
Job (org.apache.hadoop.mapreduce.v2.app.job.Job): 7 uses
Block (org.apache.hadoop.yarn.webapp.view.HtmlBlock.Block): 7 uses
HtmlBlockForTest (org.apache.hadoop.yarn.webapp.view.HtmlBlockForTest): 7 uses
ByteArrayOutputStream (java.io.ByteArrayOutputStream): 6 uses
File (java.io.File): 6 uses
HashMap (java.util.HashMap): 6 uses
YarnConfiguration (org.apache.hadoop.yarn.conf.YarnConfiguration): 6 uses
AggregatedLogsBlockForTest (org.apache.hadoop.yarn.webapp.log.AggregatedLogsBlockForTest): 6 uses
JobId (org.apache.hadoop.mapreduce.v2.api.records.JobId): 5 uses
Task (org.apache.hadoop.mapreduce.v2.app.job.Task): 5 uses
TaskId (org.apache.hadoop.mapreduce.v2.api.records.TaskId): 4 uses
FewAttemptsBlock (org.apache.hadoop.mapreduce.v2.app.webapp.AttemptsPage.FewAttemptsBlock): 4 uses
TaskReport (org.apache.hadoop.mapreduce.v2.api.records.TaskReport): 3 uses
JobIdPBImpl (org.apache.hadoop.mapreduce.v2.api.records.impl.pb.JobIdPBImpl): 3 uses