Example 6 with StorageObject

Use of org.jets3t.service.model.StorageObject in project hadoop by Apache: class Jets3tNativeFileSystemStore, method storeLargeFile.

public void storeLargeFile(String key, File file, byte[] md5Hash) throws IOException {
    S3Object object = new S3Object(key);
    object.setDataInputFile(file);
    object.setContentType("binary/octet-stream");
    object.setContentLength(file.length());
    object.setServerSideEncryptionAlgorithm(serverSideEncryptionAlgorithm);
    if (md5Hash != null) {
        object.setMd5Hash(md5Hash);
    }
    List<StorageObject> objectsToUploadAsMultipart = new ArrayList<StorageObject>();
    objectsToUploadAsMultipart.add(object);
    MultipartUtils mpUtils = new MultipartUtils(multipartBlockSize);
    try {
        mpUtils.uploadObjects(bucket.getName(), s3Service, objectsToUploadAsMultipart, null);
    } catch (Exception e) {
        handleException(e, key);
    }
}
Also used: StorageObject (org.jets3t.service.model.StorageObject), ArrayList (java.util.ArrayList), S3Object (org.jets3t.service.model.S3Object), MultipartUtils (org.jets3t.service.utils.MultipartUtils), ServiceException (org.jets3t.service.ServiceException), HttpException (org.jets3t.service.impl.rest.HttpException), S3ServiceException (org.jets3t.service.S3ServiceException), IOException (java.io.IOException), EOFException (java.io.EOFException), FileNotFoundException (java.io.FileNotFoundException), AccessControlException (org.apache.hadoop.security.AccessControlException)
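MultipartUtils splits the file into blocks of multipartBlockSize bytes and uploads the parts in parallel. A minimal sketch of the part-count arithmetic behind that split, using only the JDK (class and method names here are illustrative, not JetS3t API):

```java
// Sketch: how many parts a multipart upload of a given file needs.
// The part size mirrors the multipartBlockSize passed to MultipartUtils above;
// 5 MB is S3's minimum part size for every part except the last.
public class MultipartMath {
    static final long MIN_PART_SIZE = 5L * 1024 * 1024;

    // Ceiling division: a 17 MB file with 5 MB parts needs 4 parts.
    static long partCount(long fileSize, long partSize) {
        if (partSize < MIN_PART_SIZE) {
            throw new IllegalArgumentException("part size below S3 minimum");
        }
        return (fileSize + partSize - 1) / partSize;
    }

    public static void main(String[] args) {
        long mb = 1024 * 1024;
        System.out.println(partCount(17 * mb, 5 * mb)); // 4
    }
}
```

The ceiling division is why the last part may be smaller than multipartBlockSize; S3 only enforces the minimum size on non-final parts.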

Example 7 with StorageObject

Use of org.jets3t.service.model.StorageObject in project alluxio by Alluxio: class S3OutputStream, method close.

@Override
public void close() throws IOException {
    if (mClosed.getAndSet(true)) {
        return;
    }
    mLocalOutputStream.close();
    try {
        S3Object obj = new S3Object(mKey);
        obj.setBucketName(mBucketName);
        obj.setDataInputFile(mFile);
        obj.setContentLength(mFile.length());
        obj.setContentEncoding(Mimetypes.MIMETYPE_BINARY_OCTET_STREAM);
        if (mHash != null) {
            obj.setMd5Hash(mHash.digest());
        } else {
            LOG.warn("MD5 was not computed for: {}", mKey);
        }
        if (MULTIPART_UTIL.isFileLargerThanMaxPartSize(mFile)) {
            // Big object will be split into parts and uploaded to S3 in parallel.
            List<StorageObject> objectsToUploadAsMultipart = new ArrayList<>();
            objectsToUploadAsMultipart.add(obj);
            MULTIPART_UTIL.uploadObjects(mBucketName, mClient, objectsToUploadAsMultipart, null);
        } else {
            // Avoid uploading file with Multipart if it's not necessary to save the
            // extra overhead.
            mClient.putObject(mBucketName, obj);
        }
        if (!mFile.delete()) {
            LOG.error("Failed to delete temporary file @ {}", mFile.getPath());
        }
    } catch (Exception e) {
        LOG.error("Failed to upload {}. Temporary file @ {}", mKey, mFile.getPath());
        throw new IOException(e);
    }
}
Also used: StorageObject (org.jets3t.service.model.StorageObject), ArrayList (java.util.ArrayList), S3Object (org.jets3t.service.model.S3Object), IOException (java.io.IOException), NoSuchAlgorithmException (java.security.NoSuchAlgorithmException)
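The mHash field above is a MessageDigest that accumulates as bytes are written, so the checksum is ready by the time close() runs. A self-contained sketch of that pattern with java.security.DigestOutputStream (the helper names are illustrative; only the digest mechanics match the example):

```java
import java.io.ByteArrayOutputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.security.DigestOutputStream;
import java.security.MessageDigest;

public class Md5Capture {
    // Wrap a sink so an MD5 digest accumulates as data is written,
    // mirroring how S3OutputStream keeps mHash next to mLocalOutputStream.
    static byte[] md5Of(byte[] data) {
        try {
            MessageDigest md5 = MessageDigest.getInstance("MD5");
            OutputStream sink = new ByteArrayOutputStream();
            try (DigestOutputStream out = new DigestOutputStream(sink, md5)) {
                out.write(data);
            }
            return md5.digest(); // the bytes handed to obj.setMd5Hash(...)
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    static String hex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    public static void main(String[] args) {
        // RFC 1321 test vector for "abc".
        System.out.println(hex(md5Of("abc".getBytes(StandardCharsets.UTF_8))));
        // prints 900150983cd24fb0d6963f7d28e17f72
    }
}
```

Digesting during the write is what lets the warning branch ("MD5 was not computed") exist at all: if the MessageDigest could not be created, the upload still proceeds, just without the integrity header.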

Example 8 with StorageObject

Use of org.jets3t.service.model.StorageObject in project hadoop by Apache: class Jets3tNativeFileSystemStore, method list.

/**
   * list objects
   * @param prefix prefix
   * @param delimiter delimiter
   * @param maxListingLength max no. of entries
   * @param priorLastKey last key in any previous search
   * @return a list of matches
   * @throws IOException on any reported failure
   */
private PartialListing list(String prefix, String delimiter, int maxListingLength, String priorLastKey) throws IOException {
    try {
        if (!prefix.isEmpty() && !prefix.endsWith(PATH_DELIMITER)) {
            prefix += PATH_DELIMITER;
        }
        StorageObjectsChunk chunk = s3Service.listObjectsChunked(bucket.getName(), prefix, delimiter, maxListingLength, priorLastKey);
        FileMetadata[] fileMetadata = new FileMetadata[chunk.getObjects().length];
        for (int i = 0; i < fileMetadata.length; i++) {
            StorageObject object = chunk.getObjects()[i];
            fileMetadata[i] = new FileMetadata(object.getKey(), object.getContentLength(), object.getLastModifiedDate().getTime());
        }
        return new PartialListing(chunk.getPriorLastKey(), fileMetadata, chunk.getCommonPrefixes());
    } catch (ServiceException e) {
        handleException(e, prefix);
        // never returned - keep compiler happy
        return null;
    }
}
Also used: StorageObject (org.jets3t.service.model.StorageObject), ServiceException (org.jets3t.service.ServiceException), S3ServiceException (org.jets3t.service.S3ServiceException), StorageObjectsChunk (org.jets3t.service.StorageObjectsChunk)
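The guard at the top of list() is worth isolating: a non-empty prefix is forced to end with the path delimiter so the chunked listing stays scoped to one "directory" level. A sketch of just that rule, assuming PATH_DELIMITER is "/" as in the Hadoop store:

```java
public class PrefixNormalizer {
    static final String PATH_DELIMITER = "/"; // matches the Hadoop S3 store

    // Mirror of the guard in list(): an empty prefix lists the whole bucket,
    // anything else must be delimiter-terminated to scope the listing.
    static String normalize(String prefix) {
        if (!prefix.isEmpty() && !prefix.endsWith(PATH_DELIMITER)) {
            return prefix + PATH_DELIMITER;
        }
        return prefix;
    }

    public static void main(String[] args) {
        System.out.println(normalize("dir/sub")); // dir/sub/
        System.out.println(normalize("dir/"));    // dir/ (already terminated)
        System.out.println(normalize(""));        // empty stays empty
    }
}
```

Without the trailing delimiter, a prefix like "dir/sub" would also match keys such as "dir/subsequent/file", which is why the normalization happens before listObjectsChunked is called.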

Aggregations

StorageObject (org.jets3t.service.model.StorageObject): 8
ServiceException (org.jets3t.service.ServiceException): 7
IOException (java.io.IOException): 6
S3ServiceException (org.jets3t.service.S3ServiceException): 4
S3Object (org.jets3t.service.model.S3Object): 3
FileNotFoundException (java.io.FileNotFoundException): 2
InputStream (java.io.InputStream): 2
ArrayList (java.util.ArrayList): 2
StorageObjectsChunk (org.jets3t.service.StorageObjectsChunk): 2
UOE (io.druid.java.util.common.UOE): 1
SegmentLoadingException (io.druid.segment.loading.SegmentLoadingException): 1
DataSegment (io.druid.timeline.DataSegment): 1
EOFException (java.io.EOFException): 1
NoSuchAlgorithmException (java.security.NoSuchAlgorithmException): 1
Iterator (java.util.Iterator): 1
FileObject (javax.tools.FileObject): 1
AccessControlException (org.apache.hadoop.security.AccessControlException): 1
HttpException (org.jets3t.service.impl.rest.HttpException): 1
MultipartUtils (org.jets3t.service.utils.MultipartUtils): 1