
Example 6 with Upload

use of com.amazonaws.services.s3.transfer.Upload in project airpal by airbnb.

the class S3FilePersistor method persist.

@Override
public URI persist(JobOutputBuilder outputBuilder, Job job) {
    File file = checkNotNull(outputBuilder.build(), "output builder resulting file was null");
    val objectMetaData = new ObjectMetadata();
    objectMetaData.setContentLength(file.length());
    objectMetaData.setContentType(MediaType.CSV_UTF_8.toString());
    if (compressedOutput) {
        objectMetaData.setContentEncoding("gzip");
    }
    val putRequest = new PutObjectRequest(outputBucket, getOutputKey(file.getName()), file).withMetadata(objectMetaData);
    try {
        s3Client.putObject(putRequest);
        return UriBuilder.fromPath("/api/s3/{filename}").build(file.getName());
    } catch (AmazonClientException e) {
        throw new ExecutionClient.ExecutionFailureException(job, "Could not upload CSV to S3", e);
    } finally {
        outputBuilder.delete();
    }
}
Also used: lombok.val (lombok.val), AmazonClientException (com.amazonaws.AmazonClientException), ExecutionClient (com.airbnb.airpal.core.execution.ExecutionClient), File (java.io.File), ObjectMetadata (com.amazonaws.services.s3.model.ObjectMetadata), PutObjectRequest (com.amazonaws.services.s3.model.PutObjectRequest)
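
Note that this persistor calls s3Client.putObject(...) directly, so the snippet never touches an Upload handle even though the page indexes it. Below is a minimal sketch, not airpal code, of the same upload routed through TransferManager, which is what actually produces an Upload; the bucket name, key prefix, and client construction are illustrative assumptions.

import java.io.File;
import com.amazonaws.AmazonClientException;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.Upload;

public class TransferManagerPersistSketch {
    public static void main(String[] args) throws Exception {
        File file = new File(args[0]);
        ObjectMetadata metadata = new ObjectMetadata();
        metadata.setContentLength(file.length());
        metadata.setContentType("text/csv; charset=utf-8");
        // Hypothetical bucket and key; airpal derives these from its configuration.
        PutObjectRequest putRequest =
                new PutObjectRequest("output-bucket", "airpal/" + file.getName(), file)
                        .withMetadata(metadata);
        TransferManager transferManager = new TransferManager(new AmazonS3Client());
        try {
            // upload() returns immediately with an Upload handle;
            // waitForUploadResult() blocks until the (possibly multipart) transfer finishes.
            Upload upload = transferManager.upload(putRequest);
            System.out.println("Uploaded, ETag = " + upload.waitForUploadResult().getETag());
        } catch (AmazonClientException | InterruptedException e) {
            throw new RuntimeException("Could not upload CSV to S3", e);
        } finally {
            transferManager.shutdownNow();
        }
    }
}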

Example 7 with Upload

use of com.amazonaws.services.s3.transfer.Upload in project ice by Netflix.

the class MapDb method upload.

void upload() {
    AmazonS3Client s3Client = AwsUtils.getAmazonS3Client();
    File dir = new File(config.localDir);
    File[] files = dir.listFiles(new FilenameFilter() {

        public boolean accept(File file, String fileName) {
            return fileName.startsWith(dbName);
        }
    });
    for (File file : files) {
        s3Client.putObject(config.workS3BucketName, config.workS3BucketPrefix + file.getName(), file);
    }
    // Upload each file a second time under a "copy"-prefixed key.
    for (File file : files) {
        s3Client.putObject(config.workS3BucketName, config.workS3BucketPrefix + "copy" + file.getName(), file);
    }
}
Also used: FilenameFilter (java.io.FilenameFilter), AmazonS3Client (com.amazonaws.services.s3.AmazonS3Client), File (java.io.File)
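
The loops above issue blocking putObject calls one file at a time. A minimal sketch, not part of ice, of the same batch pushed through TransferManager.uploadFileList, which runs the transfers in parallel and hands back a single handle to wait on; the bucket, key prefix, directory, and dbName values are assumptions.

import java.io.File;
import java.io.FilenameFilter;
import java.util.Arrays;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.transfer.MultipleFileUpload;
import com.amazonaws.services.s3.transfer.TransferManager;

public class MapDbBatchUploadSketch {
    public static void main(String[] args) throws Exception {
        final String dbName = "db_prefix";          // assumed prefix, standing in for MapDb's dbName
        File dir = new File("/tmp/ice-local");      // assumed local working directory
        File[] files = dir.listFiles(new FilenameFilter() {
            public boolean accept(File file, String fileName) {
                return fileName.startsWith(dbName);
            }
        });
        TransferManager tmx = new TransferManager(new AmazonS3Client());
        try {
            // Keys become prefix + path relative to dir; transfers run in parallel.
            MultipleFileUpload batch = tmx.uploadFileList(
                    "work-bucket", "work/prefix/", dir, Arrays.asList(files));
            batch.waitForCompletion();
        } finally {
            tmx.shutdownNow();
        }
    }
}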

Example 8 with Upload

use of com.amazonaws.services.s3.transfer.Upload in project alluxio by Alluxio.

the class S3AOutputStreamTest method before.

/**
   * Sets the properties and configuration before each test runs.
   */
@Before
public void before() throws Exception {
    mFile = Mockito.mock(File.class);
    mLocalOutputStream = Mockito.mock(BufferedOutputStream.class);
    TransferManager manager = Mockito.mock(TransferManager.class);
    Upload result = Mockito.mock(Upload.class);
    Mockito.when(manager.upload(Mockito.any(PutObjectRequest.class))).thenReturn(result);
    PowerMockito.whenNew(BufferedOutputStream.class).withArguments(Mockito.any(DigestOutputStream.class)).thenReturn(mLocalOutputStream);
    PowerMockito.whenNew(File.class).withArguments(Mockito.anyString()).thenReturn(mFile);
    FileOutputStream outputStream = PowerMockito.mock(FileOutputStream.class);
    PowerMockito.whenNew(FileOutputStream.class).withArguments(mFile).thenReturn(outputStream);
    mStream = new S3AOutputStream(BUCKET_NAME, KEY, manager);
}
Also used: TransferManager (com.amazonaws.services.s3.transfer.TransferManager), FileOutputStream (java.io.FileOutputStream), Upload (com.amazonaws.services.s3.transfer.Upload), File (java.io.File), BufferedOutputStream (java.io.BufferedOutputStream), PutObjectRequest (com.amazonaws.services.s3.model.PutObjectRequest), Before (org.junit.Before)
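
A follow-up test sketch, not the Alluxio test itself, showing what these mocks are set up to verify: that close() hands a PutObjectRequest to the TransferManager and blocks on the returned Upload. It assumes the manager and result mocks from before() are promoted to fields, named mManager and mUpload here purely for illustration.

@Test
public void closeUploadsViaTransferManager() throws Exception {
    mStream.close();
    // close() should build a PutObjectRequest for the local buffer file and
    // wait for the TransferManager-managed upload to finish.
    Mockito.verify(mManager).upload(Mockito.any(PutObjectRequest.class));
    Mockito.verify(mUpload).waitForUploadResult();
}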

Example 9 with Upload

use of com.amazonaws.services.s3.transfer.Upload in project alluxio by Alluxio.

the class S3AOutputStream method close.

@Override
public void close() throws IOException {
    if (mClosed) {
        return;
    }
    mLocalOutputStream.close();
    String path = getUploadPath();
    try {
        // Generate the object metadata by setting server side encryption, md5 checksum, the file
        // length, and encoding as octet stream since no assumptions are made about the file type
        ObjectMetadata meta = new ObjectMetadata();
        if (SSE_ENABLED) {
            meta.setSSEAlgorithm(ObjectMetadata.AES_256_SERVER_SIDE_ENCRYPTION);
        }
        if (mHash != null) {
            meta.setContentMD5(new String(Base64.encode(mHash.digest())));
        }
        meta.setContentLength(mFile.length());
        meta.setContentEncoding(Mimetypes.MIMETYPE_OCTET_STREAM);
        // Generate the put request and wait for the transfer manager to complete the upload, then
        // delete the temporary file on the local machine
        PutObjectRequest putReq = new PutObjectRequest(mBucketName, path, mFile).withMetadata(meta);
        mManager.upload(putReq).waitForUploadResult();
        if (!mFile.delete()) {
            LOG.error("Failed to delete temporary file @ {}", mFile.getPath());
        }
    } catch (Exception e) {
        LOG.error("Failed to upload {}. Temporary file @ {}", path, mFile.getPath());
        throw new IOException(e);
    }
    // Set the closed flag, close can be retried until mFile.delete is called successfully
    mClosed = true;
}
Also used: IOException (java.io.IOException), ObjectMetadata (com.amazonaws.services.s3.model.ObjectMetadata), PutObjectRequest (com.amazonaws.services.s3.model.PutObjectRequest), NoSuchAlgorithmException (java.security.NoSuchAlgorithmException)
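
A minimal caller-side sketch, not from Alluxio, of how this close() path gets exercised: bytes are buffered to a local temporary file while the stream is open, and close() performs the TransferManager upload shown above. It assumes the (bucket, key, TransferManager) constructor used by the test in Example 8 and the alluxio.underfs.s3a package name.

import java.io.OutputStream;
import alluxio.underfs.s3a.S3AOutputStream;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.transfer.TransferManager;

public class S3AOutputStreamUsageSketch {
    public static void main(String[] args) throws Exception {
        TransferManager manager = new TransferManager(new AmazonS3Client());
        OutputStream out = new S3AOutputStream("my-bucket", "tmp/data.bin", manager);
        try {
            out.write("hello".getBytes("UTF-8"));   // goes to the local buffer file
        } finally {
            out.close();   // runs manager.upload(putReq).waitForUploadResult() as above
        }
        manager.shutdownNow();
    }
}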

Example 10 with Upload

use of com.amazonaws.services.s3.transfer.Upload in project jackrabbit-oak by apache.

the class S3Backend method write.

private void write(DataIdentifier identifier, File file, boolean asyncUpload, AsyncUploadCallback callback) throws DataStoreException {
    String key = getKeyName(identifier);
    ObjectMetadata objectMetaData = null;
    long start = System.currentTimeMillis();
    ClassLoader contextClassLoader = Thread.currentThread().getContextClassLoader();
    try {
        Thread.currentThread().setContextClassLoader(getClass().getClassLoader());
        // check if the same record already exists
        try {
            objectMetaData = s3service.getObjectMetadata(bucket, key);
        } catch (AmazonServiceException ase) {
            if (!(ase.getStatusCode() == 404 || ase.getStatusCode() == 403)) {
                throw ase;
            }
        }
        if (objectMetaData != null) {
            long l = objectMetaData.getContentLength();
            if (l != file.length()) {
                throw new DataStoreException("Collision: " + key + " new length: " + file.length() + " old length: " + l);
            }
            LOG.debug("[{}]'s exists, lastmodified = [{}]", key, objectMetaData.getLastModified().getTime());
            CopyObjectRequest copReq = new CopyObjectRequest(bucket, key, bucket, key);
            copReq.setNewObjectMetadata(objectMetaData);
            Copy copy = tmx.copy(s3ReqDecorator.decorate(copReq));
            try {
                copy.waitForCopyResult();
                LOG.debug("lastModified of [{}] updated successfully.", identifier);
                if (callback != null) {
                    callback.onSuccess(new AsyncUploadResult(identifier, file));
                }
            } catch (Exception e2) {
                AsyncUploadResult asyncUpRes = new AsyncUploadResult(identifier, file);
                asyncUpRes.setException(e2);
                if (callback != null) {
                    callback.onAbort(asyncUpRes);
                }
                throw new DataStoreException("Could not upload " + key, e2);
            }
        }
        if (objectMetaData == null) {
            try {
                // start multipart parallel upload using amazon sdk
                Upload up = tmx.upload(s3ReqDecorator.decorate(new PutObjectRequest(bucket, key, file)));
                // wait for upload to finish
                if (asyncUpload) {
                    up.addProgressListener(new S3UploadProgressListener(up, identifier, file, callback));
                    LOG.debug("added upload progress listener to identifier [{}]", identifier);
                } else {
                    up.waitForUploadResult();
                    LOG.debug("synchronous upload to identifier [{}] completed.", identifier);
                    if (callback != null) {
                        callback.onSuccess(new AsyncUploadResult(identifier, file));
                    }
                }
            } catch (Exception e2) {
                AsyncUploadResult asyncUpRes = new AsyncUploadResult(identifier, file);
                asyncUpRes.setException(e2);
                if (callback != null) {
                    callback.onAbort(asyncUpRes);
                }
                throw new DataStoreException("Could not upload " + key, e2);
            }
        }
    } finally {
        if (contextClassLoader != null) {
            Thread.currentThread().setContextClassLoader(contextClassLoader);
        }
    }
    LOG.debug("write of [{}], length=[{}], in async mode [{}], in [{}]ms", new Object[] { identifier, file.length(), asyncUpload, (System.currentTimeMillis() - start) });
}
Also used: AsyncUploadResult (org.apache.jackrabbit.core.data.AsyncUploadResult), DataStoreException (org.apache.jackrabbit.core.data.DataStoreException), Upload (com.amazonaws.services.s3.transfer.Upload), AmazonServiceException (com.amazonaws.AmazonServiceException), AmazonClientException (com.amazonaws.AmazonClientException), IOException (java.io.IOException), CopyObjectRequest (com.amazonaws.services.s3.model.CopyObjectRequest), Copy (com.amazonaws.services.s3.transfer.Copy), ObjectMetadata (com.amazonaws.services.s3.model.ObjectMetadata), PutObjectRequest (com.amazonaws.services.s3.model.PutObjectRequest)
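
In the asynchronous branch, completion is reported through S3UploadProgressListener, which is not shown on this page. Here is a minimal sketch of what such a listener might look like, hypothetical rather than the jackrabbit-oak implementation, driving the same AsyncUploadCallback used above from the Upload's progress events; the jackrabbit package names for the callback types follow the AsyncUploadResult import listed above.

import java.io.File;
import com.amazonaws.event.ProgressEvent;
import com.amazonaws.event.ProgressListener;
import com.amazonaws.services.s3.transfer.Upload;
import org.apache.jackrabbit.core.data.AsyncUploadCallback;
import org.apache.jackrabbit.core.data.AsyncUploadResult;
import org.apache.jackrabbit.core.data.DataIdentifier;

class UploadCompletionListener implements ProgressListener {
    // The Upload is kept only to mirror the constructor call in write(); it is unused in this sketch.
    private final Upload upload;
    private final DataIdentifier identifier;
    private final File file;
    private final AsyncUploadCallback callback;

    UploadCompletionListener(Upload upload, DataIdentifier identifier, File file,
            AsyncUploadCallback callback) {
        this.upload = upload;
        this.identifier = identifier;
        this.file = file;
        this.callback = callback;
    }

    @Override
    public void progressChanged(ProgressEvent event) {
        switch (event.getEventType()) {
            case TRANSFER_COMPLETED_EVENT:
                callback.onSuccess(new AsyncUploadResult(identifier, file));
                break;
            case TRANSFER_FAILED_EVENT:
            case TRANSFER_CANCELED_EVENT:
                // A real listener would also record the failure cause on the result.
                callback.onAbort(new AsyncUploadResult(identifier, file));
                break;
            default:
                break;
        }
    }
}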

Aggregations

PutObjectRequest (com.amazonaws.services.s3.model.PutObjectRequest): 19
ObjectMetadata (com.amazonaws.services.s3.model.ObjectMetadata): 18
Upload (com.amazonaws.services.s3.transfer.Upload): 18
AmazonClientException (com.amazonaws.AmazonClientException): 11
IOException (java.io.IOException): 11
File (java.io.File): 8
DataStoreException (org.apache.jackrabbit.core.data.DataStoreException): 7
AmazonServiceException (com.amazonaws.AmazonServiceException): 6
PartETag (com.amazonaws.services.s3.model.PartETag): 6
InitiateMultipartUploadRequest (com.amazonaws.services.s3.model.InitiateMultipartUploadRequest): 5
InitiateMultipartUploadResult (com.amazonaws.services.s3.model.InitiateMultipartUploadResult): 5
PutObjectResult (com.amazonaws.services.s3.model.PutObjectResult): 5
InputStream (java.io.InputStream): 5
InterruptedIOException (java.io.InterruptedIOException): 5
CompleteMultipartUploadRequest (com.amazonaws.services.s3.model.CompleteMultipartUploadRequest): 4
S3Object (com.amazonaws.services.s3.model.S3Object): 4
UploadPartRequest (com.amazonaws.services.s3.model.UploadPartRequest): 4
ByteArrayInputStream (java.io.ByteArrayInputStream): 4
ArrayList (java.util.ArrayList): 4
AmazonS3 (com.amazonaws.services.s3.AmazonS3): 3