Search in sources:

Example 71 with DataStoreException

Use of org.apache.jackrabbit.core.data.DataStoreException in project jackrabbit-oak by apache.

From class S3Backend, method addMetadataRecord.

@Override
public void addMetadataRecord(File input, String name) throws DataStoreException {
    ClassLoader contextClassLoader = Thread.currentThread().getContextClassLoader();
    try {
        Thread.currentThread().setContextClassLoader(getClass().getClassLoader());
        Upload upload = tmx.upload(s3ReqDecorator.decorate(new PutObjectRequest(bucket, addMetaKeyPrefix(name), input)));
        upload.waitForUploadResult();
    } catch (InterruptedException e) {
        LOG.error("Exception in uploading metadata file {}", new Object[] { input, e });
        throw new DataStoreException("Error in uploading metadata file", e);
    } finally {
        if (contextClassLoader != null) {
            Thread.currentThread().setContextClassLoader(contextClassLoader);
        }
    }
}
Also used: DataStoreException (org.apache.jackrabbit.core.data.DataStoreException), Upload (com.amazonaws.services.s3.transfer.Upload), S3Object (com.amazonaws.services.s3.model.S3Object), PutObjectRequest (com.amazonaws.services.s3.model.PutObjectRequest)
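
The snippet swaps the thread context class loader before calling into the AWS TransferManager (a common workaround for class-loading issues, e.g. in OSGi environments) and restores it in a finally block. Below is a minimal, self-contained sketch of that swap-and-restore pattern as a reusable helper; the ClassLoaderUtils and withContextClassLoader names are illustrative assumptions and not part of jackrabbit-oak.

import java.util.concurrent.Callable;

// Hypothetical helper, not part of jackrabbit-oak: run a task with a given
// context class loader and always restore the previous one afterwards.
public final class ClassLoaderUtils {

    private ClassLoaderUtils() {
    }

    public static <T> T withContextClassLoader(ClassLoader loader, Callable<T> task) throws Exception {
        Thread current = Thread.currentThread();
        ClassLoader previous = current.getContextClassLoader();
        try {
            current.setContextClassLoader(loader);
            return task.call();
        } finally {
            // Restore the caller's loader even if the task throws.
            current.setContextClassLoader(previous);
        }
    }
}

With a helper like this, the body of addMetadataRecord would reduce to a single withContextClassLoader(getClass().getClassLoader(), ...) call wrapped around the TransferManager upload.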

Example 72 with DataStoreException

Use of org.apache.jackrabbit.core.data.DataStoreException in project jackrabbit-oak by apache.

From class S3Backend, method addMetadataRecord.

public void addMetadataRecord(final InputStream input, final String name) throws DataStoreException {
    ClassLoader contextClassLoader = Thread.currentThread().getContextClassLoader();
    try {
        Thread.currentThread().setContextClassLoader(getClass().getClassLoader());
        Upload upload = tmx.upload(s3ReqDecorator.decorate(new PutObjectRequest(bucket, addMetaKeyPrefix(name), input, new ObjectMetadata())));
        upload.waitForUploadResult();
    } catch (InterruptedException e) {
        LOG.error("Error in uploading", e);
        throw new DataStoreException("Error in uploading", e);
    } finally {
        if (contextClassLoader != null) {
            Thread.currentThread().setContextClassLoader(contextClassLoader);
        }
    }
}
Also used: DataStoreException (org.apache.jackrabbit.core.data.DataStoreException), Upload (com.amazonaws.services.s3.transfer.Upload), ObjectMetadata (com.amazonaws.services.s3.model.ObjectMetadata), PutObjectRequest (com.amazonaws.services.s3.model.PutObjectRequest)
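
This overload passes a bare new ObjectMetadata() with no content length, so the SDK cannot know the stream size up front; for the small metadata records handled here that is harmless. The sketch below shows how a caller might push an in-memory payload through this method. The MetadataRecordWriter class and the writeMarker name are illustrative assumptions, not Oak API; it assumes an already-initialized S3Backend from the snippets above is on the classpath.

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

import org.apache.jackrabbit.core.data.DataStoreException;

// Illustrative caller: stores a small in-memory value as a metadata record.
public class MetadataRecordWriter {

    public static void writeMarker(S3Backend backend, String name, String value)
            throws DataStoreException, IOException {
        byte[] payload = value.getBytes(StandardCharsets.UTF_8);
        try (InputStream in = new ByteArrayInputStream(payload)) {
            // The stream's length is not passed along, so keep payloads small.
            backend.addMetadataRecord(in, name);
        }
    }
}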

Example 73 with DataStoreException

Use of org.apache.jackrabbit.core.data.DataStoreException in project jackrabbit-oak by apache.

From class S3Backend, method init.

/**
 * Initialize S3Backend. It creates an AmazonS3Client and a TransferManager from
 * aws.properties, and creates the S3 bucket if it does not already exist in S3.
 */
@Override
public void init(CachingDataStore store, String homeDir, String config) throws DataStoreException {
    Properties initProps = null;
    // Prefer properties set programmatically on this backend over config provided via a file-based config
    if (this.properties != null) {
        initProps = this.properties;
    } else {
        if (config == null) {
            config = Utils.DEFAULT_CONFIG_FILE;
        }
        try {
            initProps = Utils.readConfig(config);
        } catch (IOException e) {
            throw new DataStoreException("Could not initialize S3 from " + config, e);
        }
        this.properties = initProps;
    }
    init(store, homeDir, initProps);
}
Also used: DataStoreException (org.apache.jackrabbit.core.data.DataStoreException), IOException (java.io.IOException), Properties (java.util.Properties)
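
init prefers properties that were set programmatically and only falls back to reading a config file (Utils.DEFAULT_CONFIG_FILE by default) through Utils.readConfig. A minimal sketch of an equivalent file-based read using plain java.util.Properties follows; the ConfigReader name is hypothetical, and the key names in the comment are only typical S3 connection settings used for illustration.

import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

// Hypothetical stand-in for Utils.readConfig: load key/value pairs such as
// accessKey, secretKey and s3Bucket from a properties file on disk.
public final class ConfigReader {

    private ConfigReader() {
    }

    public static Properties read(String fileName) throws IOException {
        Properties props = new Properties();
        try (InputStream in = new FileInputStream(fileName)) {
            props.load(in);
        }
        return props;
    }
}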

Example 74 with DataStoreException

Use of org.apache.jackrabbit.core.data.DataStoreException in project jackrabbit-oak by apache.

From class S3Backend, method deleteRecord.

@Override
public void deleteRecord(DataIdentifier identifier) throws DataStoreException {
    long start = System.currentTimeMillis();
    String key = getKeyName(identifier);
    ClassLoader contextClassLoader = Thread.currentThread().getContextClassLoader();
    try {
        Thread.currentThread().setContextClassLoader(getClass().getClassLoader());
        s3service.deleteObject(bucket, key);
        LOG.debug("Identifier [{}] deleted. It took [{}]ms.", new Object[] { identifier, (System.currentTimeMillis() - start) });
    } catch (AmazonServiceException e) {
        throw new DataStoreException("Could not delete dataIdentifier " + identifier, e);
    } finally {
        if (contextClassLoader != null) {
            Thread.currentThread().setContextClassLoader(contextClassLoader);
        }
    }
}
Also used: DataStoreException (org.apache.jackrabbit.core.data.DataStoreException), AmazonServiceException (com.amazonaws.AmazonServiceException)
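
deleteRecord wraps any AmazonServiceException in a DataStoreException, so a caller that wants to retry transient S3 failures (throttling, 5xx responses) has to inspect the cause. Below is a hedged sketch of such a retry wrapper; the RetryingDelete class is illustrative, not part of Oak, and it assumes the wrapped cause is the AmazonServiceException thrown in the method above.

import com.amazonaws.AmazonServiceException;

import org.apache.jackrabbit.core.data.DataIdentifier;
import org.apache.jackrabbit.core.data.DataStoreException;

// Hypothetical retry wrapper: re-attempt a delete when S3 reports a retryable
// error, otherwise rethrow the original DataStoreException.
public class RetryingDelete {

    public static void deleteWithRetry(S3Backend backend, DataIdentifier id, int maxAttempts)
            throws DataStoreException {
        for (int attempt = 1; ; attempt++) {
            try {
                backend.deleteRecord(id);
                return;
            } catch (DataStoreException e) {
                if (attempt >= maxAttempts || !isRetryable(e.getCause())) {
                    throw e;
                }
            }
        }
    }

    private static boolean isRetryable(Throwable cause) {
        return cause instanceof AmazonServiceException
                && ((AmazonServiceException) cause).isRetryable();
    }
}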

Example 75 with DataStoreException

Use of org.apache.jackrabbit.core.data.DataStoreException in project jackrabbit-oak by apache.

From class UploadStagingCacheTest, method testAddUploadException.

/**
 * Stage a file whose first upload attempt fails, then verify that a scheduled retry succeeds.
 * @throws Exception
 */
@Test
public void testAddUploadException() throws Exception {
    final AtomicInteger count = new AtomicInteger(0);
    TestStagingUploader secondTimeUploader = new TestStagingUploader(folder.newFolder()) {

        @Override
        public void write(String id, File f) throws DataStoreException {
            // Fail the first write; succeed once the test has incremented the counter
            if (count.get() == 0) {
                throw new DataStoreException("Error in writing blob");
            }
            super.write(id, f);
        }
    };
    // initialize staging cache using the mocked uploader
    init(2, secondTimeUploader, null);
    // Add load
    List<ListenableFuture<Integer>> futures = put(folder);
    // start: release the latches gating the upload task and its callback
    taskLatch.countDown();
    callbackLatch.countDown();
    waitFinish(futures);
    // assert file retrieved from staging cache
    File ret = stagingCache.getIfPresent(ID_PREFIX + 0);
    assertTrue(Files.equal(copyToFile(randomStream(0, 4 * 1024), folder.newFile()), ret));
    assertEquals(1, stagingCache.getStats().getLoadCount());
    assertEquals(1, stagingCache.getStats().getLoadSuccessCount());
    assertCacheStats(stagingCache, 1, 4 * 1024, 1, 1);
    // Retry upload and wait for finish
    count.incrementAndGet();
    ScheduledFuture<?> scheduledFuture = removeExecutor.schedule(stagingCache.new RetryJob(), 0, TimeUnit.MILLISECONDS);
    scheduledFuture.get();
    afterExecuteLatch.await();
    // Now uploaded
    ret = stagingCache.getIfPresent(ID_PREFIX + 0);
    assertNull(ret);
    assertTrue(Files.equal(copyToFile(randomStream(0, 4 * 1024), folder.newFile()), secondTimeUploader.read(ID_PREFIX + 0)));
}
Also used: DataStoreException (org.apache.jackrabbit.core.data.DataStoreException), AtomicInteger (java.util.concurrent.atomic.AtomicInteger), ListenableFuture (com.google.common.util.concurrent.ListenableFuture), File (java.io.File), Test (org.junit.Test)
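
The moving parts of this test are easier to see without the Oak cache classes: an uploader that fails its first write, and a scheduled retry that succeeds on the second attempt. The following JDK-only sketch reproduces that fail-once/retry control flow in isolation; all names in it are illustrative and nothing here is Oak API.

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Self-contained sketch: the first upload attempt fails, a scheduled retry succeeds.
public class FailOnceRetryDemo {

    public static void main(String[] args) throws Exception {
        AtomicInteger attempts = new AtomicInteger();

        Runnable upload = () -> {
            if (attempts.getAndIncrement() == 0) {
                // Simulated first failure, mirroring the stubbed write(...) above.
                throw new RuntimeException("Error in writing blob");
            }
            System.out.println("upload succeeded on attempt " + attempts.get());
        };

        ScheduledExecutorService executor = Executors.newSingleThreadScheduledExecutor();
        try {
            try {
                upload.run();                           // first attempt fails
            } catch (RuntimeException expected) {
                System.out.println("first attempt failed: " + expected.getMessage());
            }
            // Retry immediately on the scheduler, analogous to scheduling stagingCache.new RetryJob().
            executor.schedule(upload, 0, TimeUnit.MILLISECONDS).get();
        } finally {
            executor.shutdown();
        }
    }
}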

Aggregations

DataStoreException (org.apache.jackrabbit.core.data.DataStoreException) 85
IOException (java.io.IOException) 35
AmazonServiceException (com.amazonaws.AmazonServiceException) 28
ObjectMetadata (com.amazonaws.services.s3.model.ObjectMetadata) 18
DataIdentifier (org.apache.jackrabbit.core.data.DataIdentifier) 15
File (java.io.File) 14
AmazonClientException (com.amazonaws.AmazonClientException) 12
StorageException (com.microsoft.azure.storage.StorageException) 11
InputStream (java.io.InputStream) 9
URISyntaxException (java.net.URISyntaxException) 9
RepositoryException (javax.jcr.RepositoryException) 9
CopyObjectRequest (com.amazonaws.services.s3.model.CopyObjectRequest) 7
PutObjectRequest (com.amazonaws.services.s3.model.PutObjectRequest) 7
Copy (com.amazonaws.services.s3.transfer.Copy) 7
Upload (com.amazonaws.services.s3.transfer.Upload) 7
FileObject (org.apache.commons.vfs2.FileObject) 7
FileSystemException (org.apache.commons.vfs2.FileSystemException) 7
BufferedInputStream (java.io.BufferedInputStream) 6
NoSuchAlgorithmException (java.security.NoSuchAlgorithmException) 6
S3Object (com.amazonaws.services.s3.model.S3Object) 5