
Example 1 with HttpServer2

Use of org.apache.hadoop.http.HttpServer2 in project hadoop by apache.

From the class TimelineReaderServer, method startTimelineReaderWebApp:

private void startTimelineReaderWebApp() {
    Configuration conf = getConfig();
    String bindAddress = WebAppUtils.getWebAppBindURL(conf,
        YarnConfiguration.TIMELINE_SERVICE_BIND_HOST,
        WebAppUtils.getTimelineReaderWebAppURL(conf));
    LOG.info("Instantiating TimelineReaderWebApp at " + bindAddress);
    boolean enableCorsFilter = conf.getBoolean(
        YarnConfiguration.TIMELINE_SERVICE_HTTP_CROSS_ORIGIN_ENABLED,
        YarnConfiguration.TIMELINE_SERVICE_HTTP_CROSS_ORIGIN_ENABLED_DEFAULT);
    // setup CORS
    if (enableCorsFilter) {
        conf.setBoolean(HttpCrossOriginFilterInitializer.PREFIX
            + HttpCrossOriginFilterInitializer.ENABLED_SUFFIX, true);
    }
    try {
        HttpServer2.Builder builder = new HttpServer2.Builder()
            .setName("timeline")
            .setConf(conf)
            .addEndpoint(URI.create("http://" + bindAddress));
        readerWebServer = builder.build();
        setupOptions(conf);
        readerWebServer.addJerseyResourcePackage(
            TimelineReaderWebServices.class.getPackage().getName() + ";"
                + GenericExceptionHandler.class.getPackage().getName() + ";"
                + YarnJacksonJaxbJsonProvider.class.getPackage().getName(),
            "/*");
        readerWebServer.setAttribute(TIMELINE_READER_MANAGER_ATTR, timelineReaderManager);
        readerWebServer.start();
    } catch (Exception e) {
        String msg = "TimelineReaderWebApp failed to start.";
        LOG.error(msg, e);
        throw new YarnRuntimeException(msg, e);
    }
}
Also used: Configuration (org.apache.hadoop.conf.Configuration), YarnConfiguration (org.apache.hadoop.yarn.conf.YarnConfiguration), HttpServer2 (org.apache.hadoop.http.HttpServer2), GenericExceptionHandler (org.apache.hadoop.yarn.webapp.GenericExceptionHandler), YarnJacksonJaxbJsonProvider (org.apache.hadoop.yarn.webapp.YarnJacksonJaxbJsonProvider), YarnException (org.apache.hadoop.yarn.exceptions.YarnException), YarnRuntimeException (org.apache.hadoop.yarn.exceptions.YarnRuntimeException)

Example 2 with HttpServer2

Use of org.apache.hadoop.http.HttpServer2 in project hadoop by apache.

From the class WebAppProxy, method serviceStart:

@Override
protected void serviceStart() throws Exception {
    try {
        Configuration conf = getConfig();
        HttpServer2.Builder b = new HttpServer2.Builder()
            .setName("proxy")
            .addEndpoint(URI.create(WebAppUtils.getHttpSchemePrefix(conf)
                + bindAddress + ":" + port))
            .setFindPort(port == 0)
            .setConf(getConfig())
            .setACL(acl);
        if (YarnConfiguration.useHttps(conf)) {
            WebAppUtils.loadSslConfiguration(b);
        }
        proxyServer = b.build();
        proxyServer.addServlet(ProxyUriUtils.PROXY_SERVLET_NAME, ProxyUriUtils.PROXY_PATH_SPEC, WebAppProxyServlet.class);
        proxyServer.setAttribute(FETCHER_ATTRIBUTE, fetcher);
        proxyServer.setAttribute(IS_SECURITY_ENABLED_ATTRIBUTE, isSecurityEnabled);
        proxyServer.setAttribute(PROXY_HOST_ATTRIBUTE, proxyHost);
        proxyServer.start();
    } catch (IOException e) {
        LOG.error("Could not start proxy web server", e);
        throw e;
    }
    super.serviceStart();
}
Also used: Configuration (org.apache.hadoop.conf.Configuration), YarnConfiguration (org.apache.hadoop.yarn.conf.YarnConfiguration), HttpServer2 (org.apache.hadoop.http.HttpServer2), IOException (java.io.IOException)
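The builder call above composes its endpoint URI from a scheme prefix (chosen by whether HTTPS is enabled), a bind address, and a port. A plain-Java sketch of that composition, with no Hadoop dependency (the class and method names here are made up for illustration):

```java
import java.net.URI;

public class EndpointUriDemo {

    // Compose an endpoint URI the way the proxy example does: pick the
    // scheme prefix from an https flag, then append host and port.
    public static URI endpoint(boolean useHttps, String host, int port) {
        String scheme = useHttps ? "https://" : "http://";
        return URI.create(scheme + host + ":" + port);
    }

    public static void main(String[] args) {
        System.out.println(endpoint(false, "localhost", 8088));
    }
}
```

In the real example, WebAppUtils.getHttpSchemePrefix(conf) plays the role of the `useHttps` branch.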

Example 3 with HttpServer2

Use of org.apache.hadoop.http.HttpServer2 in project hadoop by apache.

From the class ResourceManager, method httpServerTemplateForRM:

/**
   * Return an HttpServer2.Builder that the ResourceManager can use to
   * initialize its HTTP / HTTPS server.
   *
   * @param conf configuration object
   * @param httpAddr HTTP address
   * @param httpsAddr HTTPS address
   * @param name name of the server
   * @throws IOException from Builder
   * @return builder object
   */
public static HttpServer2.Builder httpServerTemplateForRM(Configuration conf,
        final InetSocketAddress httpAddr, final InetSocketAddress httpsAddr,
        String name) throws IOException {
    HttpServer2.Builder builder = new HttpServer2.Builder()
        .setName(name)
        .setConf(conf)
        .setSecurityEnabled(false);
    if (httpAddr.getPort() == 0) {
        builder.setFindPort(true);
    }
    URI uri = URI.create("http://" + NetUtils.getHostPortString(httpAddr));
    builder.addEndpoint(uri);
    LOG.info("Starting Web-server for " + name + " at: " + uri);
    return builder;
}
Also used: HttpServer2 (org.apache.hadoop.http.HttpServer2), URI (java.net.URI)
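The setFindPort(true) branch taken when httpAddr.getPort() == 0 follows the standard convention that binding to port 0 asks the OS to assign any free ephemeral port, which the caller can then read back (Example 4 reads it via getConnectorAddress(0).getPort()). A minimal JDK-only sketch of that convention, with no Hadoop dependency (FreePortDemo is a hypothetical name):

```java
import java.io.IOException;
import java.net.ServerSocket;

public class FreePortDemo {

    // Bind to port 0 and report the port the OS actually assigned.
    public static int pickFreePort() throws IOException {
        try (ServerSocket socket = new ServerSocket(0)) {
            return socket.getLocalPort();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println("OS-assigned port: " + pickFreePort());
    }
}
```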

Example 4 with HttpServer2

Use of org.apache.hadoop.http.HttpServer2 in project hadoop by apache.

From the class TestJobEndNotifier, method setUp:

public void setUp() throws Exception {
    new File(System.getProperty("build.webapps", "build/webapps") + "/test").mkdirs();
    server = new HttpServer2.Builder()
        .setName("test")
        .addEndpoint(URI.create("http://localhost:0"))
        .setFindPort(true)
        .build();
    server.addServlet("delay", "/delay", DelayServlet.class);
    server.addServlet("jobend", "/jobend", JobEndServlet.class);
    server.addServlet("fail", "/fail", FailServlet.class);
    server.start();
    int port = server.getConnectorAddress(0).getPort();
    baseUrl = new URL("http://localhost:" + port + "/");
    JobEndServlet.calledTimes = 0;
    JobEndServlet.requestUri = null;
    DelayServlet.calledTimes = 0;
    FailServlet.calledTimes = 0;
}
Also used: HttpServer2 (org.apache.hadoop.http.HttpServer2), File (java.io.File), URL (java.net.URL)

Example 5 with HttpServer2

Use of org.apache.hadoop.http.HttpServer2 in project hadoop by apache.

From the class TestTransferFsImage, method testImageUploadTimeout:

/**
   * Test to verify the timeout of Image upload
   */
@Test(timeout = 10000)
public void testImageUploadTimeout() throws Exception {
    Configuration conf = new HdfsConfiguration();
    NNStorage mockStorage = Mockito.mock(NNStorage.class);
    HttpServer2 testServer = HttpServerFunctionalTest.createServer("hdfs");
    try {
        testServer.addServlet("ImageTransfer", ImageServlet.PATH_SPEC, TestImageTransferServlet.class);
        testServer.start();
        URL serverURL = HttpServerFunctionalTest.getServerURL(testServer);
        // set the timeout here, otherwise it will take default.
        TransferFsImage.timeout = 2000;
        File tmpDir = new File(new FileSystemTestHelper().getTestRootDir());
        tmpDir.mkdirs();
        File mockImageFile = File.createTempFile("image", "", tmpDir);
        FileOutputStream imageFile = new FileOutputStream(mockImageFile);
        imageFile.write("data".getBytes());
        imageFile.close();
        Mockito.when(mockStorage.findImageFile(Mockito.any(NameNodeFile.class),
            Mockito.anyLong())).thenReturn(mockImageFile);
        Mockito.when(mockStorage.toColonSeparatedString()).thenReturn("storage:info:string");
        try {
            TransferFsImage.uploadImageFromStorage(serverURL, conf, mockStorage, NameNodeFile.IMAGE, 1L);
            fail("TransferImage Should fail with timeout");
        } catch (SocketTimeoutException e) {
            assertEquals("Upload should timeout", "Read timed out", e.getMessage());
        }
    } finally {
        testServer.stop();
    }
}
Also used: FileSystemTestHelper (org.apache.hadoop.fs.FileSystemTestHelper), SocketTimeoutException (java.net.SocketTimeoutException), Configuration (org.apache.hadoop.conf.Configuration), HdfsConfiguration (org.apache.hadoop.hdfs.HdfsConfiguration), FileOutputStream (java.io.FileOutputStream), NameNodeFile (org.apache.hadoop.hdfs.server.namenode.NNStorage.NameNodeFile), HttpServer2 (org.apache.hadoop.http.HttpServer2), File (java.io.File), URL (java.net.URL), HttpServerFunctionalTest (org.apache.hadoop.http.HttpServerFunctionalTest), Test (org.junit.Test)
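The test above works because the client's read timeout (TransferFsImage.timeout = 2000) expires before the server responds. A self-contained JDK-only sketch of the same pattern, using a deliberately slow handler and a shorter client timeout (TimeoutDemo and its names are hypothetical; no Hadoop or Mockito dependency):

```java
import com.sun.net.httpserver.HttpServer;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.SocketTimeoutException;
import java.net.URL;

public class TimeoutDemo {

    // Returns true if the client request fails with SocketTimeoutException.
    public static boolean timesOut() throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/slow", exchange -> {
            try {
                Thread.sleep(2000);  // outlast the client's read timeout
            } catch (InterruptedException ignored) {
            }
            exchange.sendResponseHeaders(200, 0);
            exchange.close();
        });
        server.start();
        try {
            URL url = new URL("http://localhost:"
                + server.getAddress().getPort() + "/slow");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setReadTimeout(500);  // shorter than the handler's delay
            try {
                conn.getResponseCode();
                return false;          // response arrived: no timeout
            } catch (SocketTimeoutException expected) {
                return true;           // read timed out, as the test asserts
            }
        } finally {
            server.stop(0);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("timed out: " + timesOut());
    }
}
```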

Aggregations

HttpServer2 (org.apache.hadoop.http.HttpServer2): 20
Configuration (org.apache.hadoop.conf.Configuration): 7
Test (org.junit.Test): 6
IOException (java.io.IOException): 4
InetSocketAddress (java.net.InetSocketAddress): 4
HttpConfig (org.apache.hadoop.http.HttpConfig): 4
YarnConfiguration (org.apache.hadoop.yarn.conf.YarnConfiguration): 4
File (java.io.File): 3
SocketTimeoutException (java.net.SocketTimeoutException): 3
URI (java.net.URI): 3
URL (java.net.URL): 3
HttpServerFunctionalTest (org.apache.hadoop.http.HttpServerFunctionalTest): 3
JobConf (org.apache.hadoop.mapred.JobConf): 3
JobImpl (org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl): 3
YarnRuntimeException (org.apache.hadoop.yarn.exceptions.YarnRuntimeException): 3
SocketException (java.net.SocketException): 2
HdfsConfiguration (org.apache.hadoop.hdfs.HdfsConfiguration): 2
JobEvent (org.apache.hadoop.mapreduce.v2.app.job.event.JobEvent): 2
AccessControlList (org.apache.hadoop.security.authorize.AccessControlList): 2
YarnException (org.apache.hadoop.yarn.exceptions.YarnException): 2