Search in sources:

Example 1 with HttpPutFailedException

Use of org.apache.hadoop.hdfs.server.common.HttpPutFailedException in the Apache Hadoop project.

From class TransferFsImage, method uploadImageFromStorage:

/**
   * Requests that the NameNode download an image from this node.  Allows for
   * optional external cancellation.
   *
   * @param fsName the http address for the remote NN
   * @param conf Configuration
   * @param storage the storage directory to transfer the image from
   * @param nnf the NameNodeFile type of the image
   * @param txid the transaction ID of the image to be uploaded
   * @param canceler optional canceler to check for abort of upload
   * @throws IOException if there is an I/O error or cancellation
   */
public static TransferResult uploadImageFromStorage(URL fsName, Configuration conf,
        NNStorage storage, NameNodeFile nnf, long txid, Canceler canceler)
        throws IOException {
    URL url = new URL(fsName, ImageServlet.PATH_SPEC);
    long startTime = Time.monotonicNow();
    try {
        uploadImage(url, conf, storage, nnf, txid, canceler);
    } catch (HttpPutFailedException e) {
        // translate the error code to a result, which is a bit more obvious in usage
        TransferResult result = TransferResult.getResultForCode(e.getResponseCode());
        if (result.shouldReThrowException) {
            throw e;
        }
        return result;
    }
    double xferSec = Math.max(((float) (Time.monotonicNow() - startTime)) / 1000.0, 0.001);
    LOG.info("Uploaded image with txid " + txid + " to namenode at " + fsName + " in " + xferSec + " seconds");
    return TransferResult.SUCCESS;
}
Also used: HttpPutFailedException (org.apache.hadoop.hdfs.server.common.HttpPutFailedException), URL (java.net.URL)
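The catch block above translates the HTTP status code carried by HttpPutFailedException into a TransferResult via getResultForCode, and only re-throws for codes the caller cannot act on. A minimal standalone sketch of such a mapping (the enum constants and the chosen status codes here are illustrative, not Hadoop's actual TransferResult definition):

```java
import java.net.HttpURLConnection;

// Hypothetical stand-in for Hadoop's TransferResult: each constant pairs an
// HTTP response code with a flag saying whether the original exception
// should be re-thrown instead of being reported as a result.
enum UploadResult {
    SUCCESS(HttpURLConnection.HTTP_OK, false),
    AUTH_FAILURE(HttpURLConnection.HTTP_FORBIDDEN, false),
    // Any code we do not recognize: surface the exception to the caller.
    UNEXPECTED_FAILURE(-1, true);

    final int responseCode;
    final boolean shouldReThrowException;

    UploadResult(int responseCode, boolean shouldReThrowException) {
        this.responseCode = responseCode;
        this.shouldReThrowException = shouldReThrowException;
    }

    // Mirrors the shape of TransferResult.getResultForCode(...) used above:
    // look up the constant for a known code, fall back to the re-throw case.
    static UploadResult getResultForCode(int code) {
        for (UploadResult r : values()) {
            if (r.responseCode == code) {
                return r;
            }
        }
        return UNEXPECTED_FAILURE;
    }
}
```

With this shape, a caller can branch on the returned constant (for example, treat AUTH_FAILURE as a retriable condition) while genuinely unexpected codes still propagate as exceptions.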

Example 2 with HttpPutFailedException

Use of org.apache.hadoop.hdfs.server.common.HttpPutFailedException in the Apache Hadoop project.

From class TransferFsImage, method uploadImage:

/*
   * Uploads the image file using the HTTP PUT method.
   */
private static void uploadImage(URL url, Configuration conf, NNStorage storage,
        NameNodeFile nnf, long txId, Canceler canceler) throws IOException {
    File imageFile = storage.findImageFile(nnf, txId);
    if (imageFile == null) {
        throw new IOException("Could not find image with txid " + txId);
    }
    HttpURLConnection connection = null;
    try {
        URIBuilder uriBuilder = new URIBuilder(url.toURI());
        // write all params for image upload request as query itself.
        // Request body contains the image to be uploaded.
        Map<String, String> params = ImageServlet.getParamsForPutImage(storage, txId, imageFile.length(), nnf);
        for (Entry<String, String> entry : params.entrySet()) {
            uriBuilder.addParameter(entry.getKey(), entry.getValue());
        }
        URL urlWithParams = uriBuilder.build().toURL();
        connection = (HttpURLConnection) connectionFactory.openConnection(urlWithParams, UserGroupInformation.isSecurityEnabled());
        // Set the request to PUT
        connection.setRequestMethod("PUT");
        connection.setDoOutput(true);
        int chunkSize = conf.getInt(DFSConfigKeys.DFS_IMAGE_TRANSFER_CHUNKSIZE_KEY, DFSConfigKeys.DFS_IMAGE_TRANSFER_CHUNKSIZE_DEFAULT);
        if (imageFile.length() > chunkSize) {
            // using chunked streaming mode to support upload of 2GB+ files and to
            // avoid internal buffering.
            // this mode should be used only if more than chunkSize data is present
            // to upload. otherwise upload may not happen sometimes.
            connection.setChunkedStreamingMode(chunkSize);
        }
        setTimeout(connection);
        // set headers for verification
        ImageServlet.setVerificationHeadersForPut(connection, imageFile);
        // Write the file to output stream.
        writeFileToPutRequest(conf, connection, imageFile, canceler);
        int responseCode = connection.getResponseCode();
        if (responseCode != HttpURLConnection.HTTP_OK) {
            throw new HttpPutFailedException(String.format(
                    "Image uploading failed, status: %d, url: %s, message: %s",
                    responseCode, urlWithParams, connection.getResponseMessage()),
                    responseCode);
        }
    } catch (AuthenticationException | URISyntaxException e) {
        throw new IOException(e);
    } finally {
        if (connection != null) {
            connection.disconnect();
        }
    }
}
Also used: AuthenticationException (org.apache.hadoop.security.authentication.client.AuthenticationException), HttpPutFailedException (org.apache.hadoop.hdfs.server.common.HttpPutFailedException), IOException (java.io.IOException), URISyntaxException (java.net.URISyntaxException), URL (java.net.URL), URIBuilder (org.apache.http.client.utils.URIBuilder), HttpURLConnection (java.net.HttpURLConnection), File (java.io.File), NameNodeFile (org.apache.hadoop.hdfs.server.namenode.NNStorage.NameNodeFile)
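uploadImage only enables chunked streaming when the image is larger than the configured chunk size: chunked transfer encoding avoids HttpURLConnection's internal buffering and supports bodies over 2 GB, but (as the comment in the method notes) should not be used for small bodies. A standalone sketch of that decision, where the class and helper names are hypothetical and fixed-length streaming is shown as the natural alternative for small uploads:

```java
import java.net.HttpURLConnection;

// Hypothetical helper isolating the streaming-mode decision made in
// TransferFsImage.uploadImage above.
class PutStreamingMode {
    // Chunked mode pays off only when the body actually exceeds the chunk
    // size; otherwise the whole body fits in one chunk anyway.
    static boolean useChunked(long bodyLength, int chunkSize) {
        return bodyLength > chunkSize;
    }

    static void configure(HttpURLConnection conn, long bodyLength, int chunkSize) {
        if (useChunked(bodyLength, chunkSize)) {
            // Chunked transfer encoding: no Content-Length header, no
            // internal buffering, works for bodies larger than 2 GB.
            conn.setChunkedStreamingMode(chunkSize);
        } else {
            // Small bodies: Content-Length is known up front, so
            // fixed-length streaming avoids buffering without chunking.
            conn.setFixedLengthStreamingMode(bodyLength);
        }
    }
}
```

Note that once either streaming mode is set, HttpURLConnection can no longer automatically retry or follow redirects for the request, which is one reason the Hadoop code checks the response code explicitly after writing the body.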

Aggregations

URL (java.net.URL): 2
HttpPutFailedException (org.apache.hadoop.hdfs.server.common.HttpPutFailedException): 2
File (java.io.File): 1
IOException (java.io.IOException): 1
HttpURLConnection (java.net.HttpURLConnection): 1
URISyntaxException (java.net.URISyntaxException): 1
NameNodeFile (org.apache.hadoop.hdfs.server.namenode.NNStorage.NameNodeFile): 1
AuthenticationException (org.apache.hadoop.security.authentication.client.AuthenticationException): 1
URIBuilder (org.apache.http.client.utils.URIBuilder): 1