Example 1 with WebApp

Use of org.apache.hadoop.yarn.webapp.WebApp in project hadoop by apache, from the class TestHsWebServicesJobs, method testJobCountersForKilledJob:

@Test
public void testJobCountersForKilledJob() throws Exception {
    WebResource r = resource();
    appContext = new MockHistoryContext(0, 1, 1, 1, true);
    GuiceServletConfig.setInjector(Guice.createInjector(new ServletModule() {

        @Override
        protected void configureServlets() {
            webApp = mock(HsWebApp.class);
            when(webApp.name()).thenReturn("hsmockwebapp");
            bind(JAXBContextResolver.class);
            bind(HsWebServices.class);
            bind(GenericExceptionHandler.class);
            bind(WebApp.class).toInstance(webApp);
            bind(AppContext.class).toInstance(appContext);
            bind(HistoryContext.class).toInstance(appContext);
            bind(Configuration.class).toInstance(conf);
            serve("/*").with(GuiceContainer.class);
        }
    }));
    Map<JobId, Job> jobsMap = appContext.getAllJobs();
    for (JobId id : jobsMap.keySet()) {
        String jobId = MRApps.toString(id);
        ClientResponse response = r.path("ws").path("v1").path("history")
                .path("mapreduce").path("jobs").path(jobId).path("counters/")
                .accept(MediaType.APPLICATION_JSON).get(ClientResponse.class);
        assertEquals(MediaType.APPLICATION_JSON_TYPE + "; " + JettyUtils.UTF_8,
                response.getType().toString());
        JSONObject json = response.getEntity(JSONObject.class);
        assertEquals("incorrect number of elements", 1, json.length());
        JSONObject info = json.getJSONObject("jobCounters");
        WebServicesTestUtils.checkStringMatch("id", MRApps.toString(id), info.getString("id"));
        assertTrue("Job shouldn't contain any counters", info.length() == 1);
    }
}
Also used: ClientResponse (com.sun.jersey.api.client.ClientResponse), Configuration (org.apache.hadoop.conf.Configuration), MockHistoryContext (org.apache.hadoop.mapreduce.v2.hs.MockHistoryContext), AppContext (org.apache.hadoop.mapreduce.v2.app.AppContext), HistoryContext (org.apache.hadoop.mapreduce.v2.hs.HistoryContext), WebResource (com.sun.jersey.api.client.WebResource), ServletModule (com.google.inject.servlet.ServletModule), JSONObject (org.codehaus.jettison.json.JSONObject), Job (org.apache.hadoop.mapreduce.v2.app.job.Job), JobId (org.apache.hadoop.mapreduce.v2.api.records.JobId), WebApp (org.apache.hadoop.yarn.webapp.WebApp), Test (org.junit.Test)
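
The test above exercises the jobCounters endpoint through an in-process Guice/Jersey container. For comparison, here is a minimal standalone sketch of the same request against a running JobHistory server using the Jersey 1.x client (the host, port, and job id are placeholder assumptions; 19888 is the default history-server web port, and reading the entity as a Jettison JSONObject requires jersey-json on the classpath):

import javax.ws.rs.core.MediaType;

import org.codehaus.jettison.json.JSONObject;

import com.sun.jersey.api.client.Client;
import com.sun.jersey.api.client.ClientResponse;
import com.sun.jersey.api.client.WebResource;

public class JobCountersClient {

    public static void main(String[] args) throws Exception {
        // Placeholder endpoint and job id; substitute real values.
        WebResource r = Client.create().resource("http://localhost:19888");
        String jobId = "job_1400000000000_0001";
        ClientResponse response = r.path("ws").path("v1").path("history")
                .path("mapreduce").path("jobs").path(jobId).path("counters")
                .accept(MediaType.APPLICATION_JSON).get(ClientResponse.class);
        // The payload nests everything under a single "jobCounters" element,
        // which is what the assertions in the test above rely on.
        JSONObject json = response.getEntity(JSONObject.class);
        System.out.println(json.getJSONObject("jobCounters").toString(2));
    }
}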

Example 2 with WebApp

Use of org.apache.hadoop.yarn.webapp.WebApp in project hadoop by apache, from the class TestHsWebServicesAcls, method setup:

@Before
public void setup() throws IOException {
    this.conf = new JobConf();
    this.conf.set(CommonConfigurationKeys.HADOOP_SECURITY_GROUP_MAPPING, NullGroupsProvider.class.getName());
    this.conf.setBoolean(MRConfig.MR_ACLS_ENABLED, true);
    Groups.getUserToGroupsMappingService(conf);
    this.ctx = buildHistoryContext(this.conf);
    WebApp webApp = mock(HsWebApp.class);
    when(webApp.name()).thenReturn("hsmockwebapp");
    this.hsWebServices = new HsWebServices(ctx, conf, webApp);
    this.hsWebServices.setResponse(mock(HttpServletResponse.class));
    Job job = ctx.getAllJobs().values().iterator().next();
    this.jobIdStr = job.getID().toString();
    Task task = job.getTasks().values().iterator().next();
    this.taskIdStr = task.getID().toString();
    this.taskAttemptIdStr = task.getAttempts().keySet().iterator().next().toString();
}
Also used: Task (org.apache.hadoop.mapreduce.v2.app.job.Task), HttpServletResponse (javax.servlet.http.HttpServletResponse), Job (org.apache.hadoop.mapreduce.v2.app.job.Job), JobConf (org.apache.hadoop.mapred.JobConf), WebApp (org.apache.hadoop.yarn.webapp.WebApp), Before (org.junit.Before)
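
With this fixture in place, the individual test methods call the web-service methods directly with a mocked HttpServletRequest and assert on the ACL outcome. A minimal sketch of that pattern (the "enemy" user name is a placeholder, and the getJob(HttpServletRequest, String) signature is assumed from Hadoop's HsWebServices; extra imports needed: javax.ws.rs.WebApplicationException and javax.ws.rs.core.Response.Status):

@Test
public void testGetJobDeniedForUnauthorizedUser() {
    // Simulate a caller who is not covered by the job's view ACL.
    HttpServletRequest hsr = mock(HttpServletRequest.class);
    when(hsr.getRemoteUser()).thenReturn("enemy");
    try {
        hsWebServices.getJob(hsr, jobIdStr);
        fail("request should have been rejected by the ACL check");
    } catch (WebApplicationException e) {
        assertEquals(Status.UNAUTHORIZED.getStatusCode(),
                e.getResponse().getStatus());
    }
}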

Example 3 with WebApp

Use of org.apache.hadoop.yarn.webapp.WebApp in project apex-core by apache, from the class StreamingAppMasterService, method serviceStart:

@Override
protected void serviceStart() throws Exception {
    super.serviceStart();
    if (UserGroupInformation.isSecurityEnabled()) {
        delegationTokenManager.startThreads();
    }
    // write the connect address for containers to DFS
    InetSocketAddress connectAddress = NetUtils.getConnectAddress(this.heartbeatListener.getAddress());
    URI connectUri = RecoverableRpcProxy.toConnectURI(connectAddress);
    FSRecoveryHandler recoveryHandler = new FSRecoveryHandler(dag.assertAppPath(), getConfig());
    recoveryHandler.writeConnectUri(connectUri.toString());
    // start web service
    try {
        org.mortbay.log.Log.setLog(null);
    } catch (Throwable throwable) {
        // SPOI-2687. As part of Pivotal Certification, we need to catch
        // ClassNotFoundException, as Pivotal was using Jetty 7 whereas
        // other distros are using Jetty 6.
        // LOG.error("can't set the log to null: ", throwable);
    }
    try {
        Configuration config = getConfig();
        if (SecurityUtils.isStramWebSecurityEnabled()) {
            config = new Configuration(config);
            config.set("hadoop.http.filter.initializers", StramWSFilterInitializer.class.getCanonicalName());
        }
        String customSSLConfig = dag.getValue(LogicalPlan.STRAM_HTTP_CUSTOM_CONFIG);
        if (StringUtils.isNotEmpty(customSSLConfig)) {
            config.addResource(new Path(customSSLConfig));
        }
        WebApp webApp = WebApps.$for("stram", StramAppContext.class, appContext, "ws")
                .with(config).start(new StramWebApp(this.dnmgr));
        LOG.info("Started web service at port: " + webApp.port());
        appMasterTrackingUrl = NetUtils.getConnectAddress(webApp.getListenerAddress())
                .getHostName() + ":" + webApp.port();
        if (ConfigUtils.isSSLEnabled(config)) {
            appMasterTrackingUrl = "https://" + appMasterTrackingUrl;
        }
        LOG.info("Setting tracking URL to: " + appMasterTrackingUrl);
    } catch (Exception e) {
        LOG.error("Webapps failed to start. Ignoring for now:", e);
    }
}
Also used: Path (org.apache.hadoop.fs.Path), Configuration (org.apache.hadoop.conf.Configuration), YarnConfiguration (org.apache.hadoop.yarn.conf.YarnConfiguration), StramWSFilterInitializer (com.datatorrent.stram.security.StramWSFilterInitializer), InetSocketAddress (java.net.InetSocketAddress), StramWebApp (com.datatorrent.stram.webapp.StramWebApp), URI (java.net.URI), YarnException (org.apache.hadoop.yarn.exceptions.YarnException), IOException (java.io.IOException), YarnRuntimeException (org.apache.hadoop.yarn.exceptions.YarnRuntimeException), WebApp (org.apache.hadoop.yarn.webapp.WebApp)
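
Stripped of the application-specific wiring, the core of the example above is YARN's fluent WebApps builder. A minimal, self-contained sketch of that builder on its own (the "demo" prefix is a placeholder; at(0) requests an ephemeral port, and start() without an argument serves the default routes):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.yarn.conf.YarnConfiguration;
import org.apache.hadoop.yarn.webapp.WebApp;
import org.apache.hadoop.yarn.webapp.WebApps;

public class WebAppDemo {

    public static void main(String[] args) {
        Configuration conf = new YarnConfiguration();
        // $for(prefix) names the app and its URL prefix; at(0) binds
        // an ephemeral port chosen by the OS.
        WebApp webApp = WebApps.$for("demo").at(0).with(conf).start();
        System.out.println("web UI listening on port " + webApp.port());
        webApp.stop();
    }
}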

Aggregations

WebApp (org.apache.hadoop.yarn.webapp.WebApp): 3 uses
Configuration (org.apache.hadoop.conf.Configuration): 2 uses
Job (org.apache.hadoop.mapreduce.v2.app.job.Job): 2 uses
StramWSFilterInitializer (com.datatorrent.stram.security.StramWSFilterInitializer): 1 use
StramWebApp (com.datatorrent.stram.webapp.StramWebApp): 1 use
ServletModule (com.google.inject.servlet.ServletModule): 1 use
ClientResponse (com.sun.jersey.api.client.ClientResponse): 1 use
WebResource (com.sun.jersey.api.client.WebResource): 1 use
IOException (java.io.IOException): 1 use
InetSocketAddress (java.net.InetSocketAddress): 1 use
URI (java.net.URI): 1 use
HttpServletResponse (javax.servlet.http.HttpServletResponse): 1 use
Path (org.apache.hadoop.fs.Path): 1 use
JobConf (org.apache.hadoop.mapred.JobConf): 1 use
JobId (org.apache.hadoop.mapreduce.v2.api.records.JobId): 1 use
AppContext (org.apache.hadoop.mapreduce.v2.app.AppContext): 1 use
Task (org.apache.hadoop.mapreduce.v2.app.job.Task): 1 use
HistoryContext (org.apache.hadoop.mapreduce.v2.hs.HistoryContext): 1 use
MockHistoryContext (org.apache.hadoop.mapreduce.v2.hs.MockHistoryContext): 1 use
YarnConfiguration (org.apache.hadoop.yarn.conf.YarnConfiguration): 1 use