
Example 71 with S3FileTransferRequestParamsDto

Use of org.finra.herd.model.dto.S3FileTransferRequestParamsDto in project herd by FINRAOS.

From the class S3DaoTest, the method testGetPropertiesHandleGenericException:

@Test
public void testGetPropertiesHandleGenericException() throws Exception {
    // Swap the real S3 and Java properties collaborators for mocks via reflection so their behavior can be stubbed.
    S3Operations originalS3Operations = (S3Operations) ReflectionTestUtils.getField(s3Dao, "s3Operations");
    S3Operations mockS3Operations = mock(S3Operations.class);
    ReflectionTestUtils.setField(s3Dao, "s3Operations", mockS3Operations);
    JavaPropertiesHelper originalJavaPropertiesHelper = (JavaPropertiesHelper) ReflectionTestUtils.getField(s3Dao, "javaPropertiesHelper");
    JavaPropertiesHelper mockJavaPropertiesHelper = mock(JavaPropertiesHelper.class);
    ReflectionTestUtils.setField(s3Dao, "javaPropertiesHelper", mockJavaPropertiesHelper);
    try {
        // Set up the test input parameters.
        String bucketName = "bucketName";
        String key = "key";
        S3FileTransferRequestParamsDto s3FileTransferRequestParamsDto = new S3FileTransferRequestParamsDto();
        // Stub the mocked dependencies: return a non-empty S3 object, but throw when its content is parsed as properties.
        S3Object s3Object = new S3Object();
        s3Object.setObjectContent(new ByteArrayInputStream(new byte[] { 0 }));
        when(mockS3Operations.getS3Object(any(), any())).thenReturn(s3Object);
        when(mockJavaPropertiesHelper.getProperties(any(InputStream.class))).thenThrow(new RuntimeException("message"));
        try {
            // Call the method under test and expect the stubbed exception to propagate unchanged.
            s3Dao.getProperties(bucketName, key, s3FileTransferRequestParamsDto);
            fail();
        } catch (Exception e) {
            assertEquals(RuntimeException.class, e.getClass());
            assertEquals("message", e.getMessage());
        }
    } finally {
        // Restore the original collaborators so other tests against the same bean are unaffected.
        ReflectionTestUtils.setField(s3Dao, "s3Operations", originalS3Operations);
        ReflectionTestUtils.setField(s3Dao, "javaPropertiesHelper", originalJavaPropertiesHelper);
    }
}
Also used : JavaPropertiesHelper(org.finra.herd.dao.helper.JavaPropertiesHelper) S3FileTransferRequestParamsDto(org.finra.herd.model.dto.S3FileTransferRequestParamsDto) ByteArrayInputStream(java.io.ByteArrayInputStream) InputStream(java.io.InputStream) S3Object(com.amazonaws.services.s3.model.S3Object) MultiObjectDeleteException(com.amazonaws.services.s3.model.MultiObjectDeleteException) ObjectNotFoundException(org.finra.herd.model.ObjectNotFoundException) AmazonServiceException(com.amazonaws.AmazonServiceException) AmazonClientException(com.amazonaws.AmazonClientException) AmazonS3Exception(com.amazonaws.services.s3.model.AmazonS3Exception) IOException(java.io.IOException) Test(org.junit.Test)
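
The pattern above, swapping a private collaborator for a Mockito mock via ReflectionTestUtils and restoring it in a finally block, is reusable. The sketch below factors it into a hypothetical FieldSwapper helper; the helper class and its withMockedField method are illustrative assumptions rather than herd code, but ReflectionTestUtils.getField/setField and Mockito.mock are the same calls used in the test above.

import java.util.function.Consumer;

import org.springframework.test.util.ReflectionTestUtils;

import static org.mockito.Mockito.mock;

public final class FieldSwapper {

    private FieldSwapper() {
    }

    /**
     * Replaces the named field on the target with a Mockito mock, hands the mock to the test body
     * for stubbing and assertions, and always restores the original value afterwards.
     */
    public static <T> void withMockedField(Object target, String fieldName, Class<T> fieldType, Consumer<T> testBody) {
        // Remember the real collaborator so it can be put back in the finally block.
        Object original = ReflectionTestUtils.getField(target, fieldName);
        T mocked = mock(fieldType);
        ReflectionTestUtils.setField(target, fieldName, mocked);
        try {
            testBody.accept(mocked);
        } finally {
            // Restore the original field value so other tests against the same bean are unaffected.
            ReflectionTestUtils.setField(target, fieldName, original);
        }
    }
}

A test like the one above could then wrap its body in FieldSwapper.withMockedField(s3Dao, "s3Operations", S3Operations.class, mockS3Operations -> { ... }) and keep the stubbing and assertions inside the lambda.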

Example 72 with S3FileTransferRequestParamsDto

Use of org.finra.herd.model.dto.S3FileTransferRequestParamsDto in project herd by FINRAOS.

From the class BusinessObjectDataServiceDestroyBusinessObjectDataTest, the method testDestroyBusinessObjectData:

@Test
public void testDestroyBusinessObjectData() throws Exception {
    // Create a primary partition value that satisfies the retention threshold check.
    String primaryPartitionValue = DateFormatUtils.format(DateUtils.addDays(new Date(), -1 * (RETENTION_PERIOD_DAYS + 1)), AbstractHerdDao.DEFAULT_SINGLE_DAY_DATE_MASK);
    // Build the expected S3 key prefix for test business object data.
    String s3KeyPrefix = getExpectedS3KeyPrefix(BDEF_NAMESPACE, DATA_PROVIDER_NAME, BDEF_NAME, FORMAT_USAGE_CODE, FORMAT_FILE_TYPE_CODE, FORMAT_VERSION, PARTITION_KEY, primaryPartitionValue, null, null, DATA_VERSION);
    // Create an S3FileTransferRequestParamsDto to access the S3 bucket location.
    // Since the test S3 key prefix represents a directory, we add a trailing '/' character to it.
    S3FileTransferRequestParamsDto s3FileTransferRequestParamsDto = S3FileTransferRequestParamsDto.builder().withS3BucketName(S3_BUCKET_NAME).withS3KeyPrefix(s3KeyPrefix + "/").build();
    // Create an S3 storage with the relevant attributes.
    storageDaoTestHelper.createStorageEntity(STORAGE_NAME, StoragePlatformEntity.S3, Arrays.asList(new Attribute(configurationHelper.getProperty(ConfigurationValue.S3_ATTRIBUTE_NAME_BUCKET_NAME), AbstractServiceTest.S3_BUCKET_NAME), new Attribute(configurationHelper.getProperty(ConfigurationValue.S3_ATTRIBUTE_NAME_KEY_PREFIX_VELOCITY_TEMPLATE), AbstractServiceTest.S3_KEY_PREFIX_VELOCITY_TEMPLATE), new Attribute(configurationHelper.getProperty(ConfigurationValue.S3_ATTRIBUTE_NAME_VALIDATE_PATH_PREFIX), Boolean.TRUE.toString()), new Attribute(configurationHelper.getProperty(ConfigurationValue.S3_ATTRIBUTE_NAME_VALIDATE_FILE_EXISTENCE), Boolean.TRUE.toString())));
    // Create a business object data key.
    BusinessObjectDataKey businessObjectDataKey = new BusinessObjectDataKey(BDEF_NAMESPACE, BDEF_NAME, FORMAT_USAGE_CODE, FORMAT_FILE_TYPE_CODE, FORMAT_VERSION, primaryPartitionValue, NO_SUBPARTITION_VALUES, DATA_VERSION);
    // Create and persist a storage unit in the storage.
    StorageUnitEntity storageUnitEntity = storageUnitDaoTestHelper.createStorageUnitEntity(STORAGE_NAME, businessObjectDataKey, LATEST_VERSION_FLAG_SET, BusinessObjectDataStatusEntity.VALID, StorageUnitStatusEntity.ENABLED, NO_STORAGE_DIRECTORY_PATH);
    // Add storage files to the storage unit.
    for (String filePath : LOCAL_FILES) {
        storageFileDaoTestHelper.createStorageFileEntity(storageUnitEntity, s3KeyPrefix + "/" + filePath, FILE_SIZE_1_KB, ROW_COUNT_1000);
    }
    // Get the storage files.
    List<StorageFile> storageFiles = storageFileHelper.createStorageFilesFromEntities(storageUnitEntity.getStorageFiles());
    // Get the business object format entity.
    BusinessObjectFormatEntity businessObjectFormatEntity = storageUnitEntity.getBusinessObjectData().getBusinessObjectFormat();
    // Set the retention information on the business object format, which is the latest business object format version.
    businessObjectFormatEntity.setRetentionType(retentionTypeDao.getRetentionTypeByCode(RetentionTypeEntity.PARTITION_VALUE));
    businessObjectFormatEntity.setRetentionPeriodInDays(RETENTION_PERIOD_DAYS);
    businessObjectFormatDao.saveAndRefresh(businessObjectFormatEntity);
    // Override configuration to specify some settings required for testing.
    Map<String, Object> overrideMap = new HashMap<>();
    overrideMap.put(ConfigurationValue.S3_OBJECT_DELETE_TAG_KEY.getKey(), S3_OBJECT_TAG_KEY);
    overrideMap.put(ConfigurationValue.S3_OBJECT_DELETE_TAG_VALUE.getKey(), S3_OBJECT_TAG_VALUE);
    overrideMap.put(ConfigurationValue.S3_OBJECT_DELETE_ROLE_ARN.getKey(), S3_OBJECT_TAGGER_ROLE_ARN);
    overrideMap.put(ConfigurationValue.S3_OBJECT_DELETE_ROLE_SESSION_NAME.getKey(), S3_OBJECT_TAGGER_ROLE_SESSION_NAME);
    modifyPropertySourceInEnvironment(overrideMap);
    try {
        // Put the relevant S3 files into the S3 bucket.
        for (StorageFile storageFile : storageFiles) {
            s3Operations.putObject(new PutObjectRequest(S3_BUCKET_NAME, storageFile.getFilePath(), new ByteArrayInputStream(new byte[storageFile.getFileSizeBytes().intValue()]), null), null);
        }
        // Request to destroy business object data.
        BusinessObjectData result = businessObjectDataService.destroyBusinessObjectData(businessObjectDataKey);
        // Validate the result.
        assertNotNull(result);
        assertEquals(storageUnitEntity.getBusinessObjectData().getId(), Integer.valueOf(result.getId()));
        // Validate the status of the storage unit entity.
        assertEquals(StorageUnitStatusEntity.DISABLED, storageUnitEntity.getStatus().getCode());
        // Validate the status of the business object data entity.
        assertEquals(BusinessObjectDataStatusEntity.DELETED, storageUnitEntity.getBusinessObjectData().getStatus().getCode());
        // Validate that all S3 files are now tagged.
        for (StorageFile storageFile : storageFiles) {
            GetObjectTaggingResult getObjectTaggingResult = s3Operations.getObjectTagging(new GetObjectTaggingRequest(S3_BUCKET_NAME, storageFile.getFilePath()), null);
            assertEquals(Arrays.asList(new Tag(S3_OBJECT_TAG_KEY, S3_OBJECT_TAG_VALUE)), getObjectTaggingResult.getTagSet());
        }
    } finally {
        // Delete test files from S3 storage.
        if (!s3Dao.listDirectory(s3FileTransferRequestParamsDto).isEmpty()) {
            s3Dao.deleteDirectory(s3FileTransferRequestParamsDto);
        }
        s3Operations.rollback();
        // Restore the property sources so we don't affect other tests.
        restorePropertySourceInEnvironment();
    }
}
Also used : GetObjectTaggingRequest(com.amazonaws.services.s3.model.GetObjectTaggingRequest) S3FileTransferRequestParamsDto(org.finra.herd.model.dto.S3FileTransferRequestParamsDto) Attribute(org.finra.herd.model.api.xml.Attribute) StorageUnitEntity(org.finra.herd.model.jpa.StorageUnitEntity) HashMap(java.util.HashMap) BusinessObjectData(org.finra.herd.model.api.xml.BusinessObjectData) BusinessObjectDataKey(org.finra.herd.model.api.xml.BusinessObjectDataKey) BusinessObjectFormatEntity(org.finra.herd.model.jpa.BusinessObjectFormatEntity) Date(java.util.Date) GetObjectTaggingResult(com.amazonaws.services.s3.model.GetObjectTaggingResult) ByteArrayInputStream(java.io.ByteArrayInputStream) StorageFile(org.finra.herd.model.api.xml.StorageFile) Tag(com.amazonaws.services.s3.model.Tag) PutObjectRequest(com.amazonaws.services.s3.model.PutObjectRequest) Test(org.junit.Test)
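
The finally block above shows a cleanup idiom that recurs throughout these examples: build an S3FileTransferRequestParamsDto for a directory-style key prefix (with a trailing '/') and delete its contents only when the listing is non-empty. The sketch below extracts it into a hypothetical S3TestCleanup helper; the class name and constructor wiring are illustrative assumptions, while builder(), withS3BucketName(), withS3KeyPrefix(), listDirectory(), and deleteDirectory() are the calls shown in the snippet above.

import org.finra.herd.dao.S3Dao;
import org.finra.herd.model.dto.S3FileTransferRequestParamsDto;

public class S3TestCleanup {

    private final S3Dao s3Dao;

    public S3TestCleanup(S3Dao s3Dao) {
        this.s3Dao = s3Dao;
    }

    /** Deletes all keys under the given prefix in the given bucket, if any exist. */
    public void deleteTestDirectory(String bucketName, String keyPrefix) {
        // The key prefix represents a directory, so a trailing '/' character is appended.
        S3FileTransferRequestParamsDto params =
            S3FileTransferRequestParamsDto.builder().withS3BucketName(bucketName).withS3KeyPrefix(keyPrefix + "/").build();
        if (!s3Dao.listDirectory(params).isEmpty()) {
            s3Dao.deleteDirectory(params);
        }
    }
}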

Example 73 with S3FileTransferRequestParamsDto

Use of org.finra.herd.model.dto.S3FileTransferRequestParamsDto in project herd by FINRAOS.

From the class BusinessObjectDataServiceDeleteBusinessObjectDataTest, the method cleanEnv:

/**
 * Cleans up the local temp directory and S3 test path that we are using.
 */
@After
public void cleanEnv() throws IOException {
    // Clean up the local directory.
    FileUtils.deleteDirectory(localTempPath.toFile());
    // Clean up the destination S3 folders.
    S3FileTransferRequestParamsDto s3FileTransferRequestParamsDto = s3DaoTestHelper.getTestS3FileTransferRequestParamsDto();
    for (String keyPrefix : Arrays.asList(testS3KeyPrefix, TEST_S3_KEY_PREFIX)) {
        // Since the key prefix represents a directory, we add a trailing '/' character to it.
        s3FileTransferRequestParamsDto.setS3KeyPrefix(keyPrefix + "/");
        s3Dao.deleteDirectory(s3FileTransferRequestParamsDto);
    }
}
Also used : S3FileTransferRequestParamsDto(org.finra.herd.model.dto.S3FileTransferRequestParamsDto) After(org.junit.After)
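
The cleanup above mutates a single DTO with setS3KeyPrefix, while other examples build a fresh DTO with the fluent builder. The sketch below contrasts the two construction styles side by side; it is illustrative only, and setS3BucketName is an assumption about the DTO's setters (setS3KeyPrefix and the builder methods appear in the snippets above).

import org.finra.herd.model.dto.S3FileTransferRequestParamsDto;

public class S3ParamsDtoStyles {

    // Setter style: create the DTO once, then repoint it at different key prefixes (as in the @After cleanup above).
    static S3FileTransferRequestParamsDto setterStyle(String bucketName, String keyPrefix) {
        S3FileTransferRequestParamsDto params = new S3FileTransferRequestParamsDto();
        params.setS3BucketName(bucketName);
        params.setS3KeyPrefix(keyPrefix + "/");
        return params;
    }

    // Builder style: assemble the DTO in a single fluent expression when the full prefix is known up front.
    static S3FileTransferRequestParamsDto builderStyle(String bucketName, String keyPrefix) {
        return S3FileTransferRequestParamsDto.builder().withS3BucketName(bucketName).withS3KeyPrefix(keyPrefix + "/").build();
    }
}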

Example 74 with S3FileTransferRequestParamsDto

Use of org.finra.herd.model.dto.S3FileTransferRequestParamsDto in project herd by FINRAOS.

From the class BusinessObjectDataServiceRestoreBusinessObjectDataTest, the method testRestoreBusinessObjectDataNonGlacierStorageClass:

@Test
public void testRestoreBusinessObjectDataNonGlacierStorageClass() throws Exception {
    // Create an S3FileTransferRequestParamsDto to access the S3 bucket.
    // Since the test S3 key prefix represents a directory, we add a trailing '/' character to it.
    S3FileTransferRequestParamsDto glacierS3FileTransferRequestParamsDto = S3FileTransferRequestParamsDto.builder().withS3BucketName(S3_BUCKET_NAME).withS3KeyPrefix(S3_BUCKET_NAME + "/" + TEST_S3_KEY_PREFIX + "/").build();
    // Create a business object data key.
    BusinessObjectDataKey businessObjectDataKey = new BusinessObjectDataKey(BDEF_NAMESPACE, BDEF_NAME, FORMAT_USAGE_CODE, FORMAT_FILE_TYPE_CODE, FORMAT_VERSION, PARTITION_VALUE, NO_SUBPARTITION_VALUES, DATA_VERSION);
    // Create database entities required for testing.
    BusinessObjectDataEntity businessObjectDataEntity = businessObjectDataServiceTestHelper.createDatabaseEntitiesForInitiateRestoreTesting(businessObjectDataKey);
    // Get the storage unit entity.
    StorageUnitEntity storageUnitEntity = storageUnitDaoHelper.getStorageUnitEntity(STORAGE_NAME, businessObjectDataEntity);
    try {
        // Put the relevant non-Glacier storage class files into the S3 bucket.
        for (StorageFileEntity storageFileEntity : storageUnitEntity.getStorageFiles()) {
            ObjectMetadata metadata = new ObjectMetadata();
            metadata.setHeader(Headers.STORAGE_CLASS, StorageClass.Standard);
            metadata.setOngoingRestore(false);
            s3Operations.putObject(new PutObjectRequest(S3_BUCKET_NAME, storageFileEntity.getPath(), new ByteArrayInputStream(new byte[storageFileEntity.getFileSizeBytes().intValue()]), metadata), NO_S3_CLIENT);
        }
        // Initiate a restore request for the business object data.
        BusinessObjectData businessObjectData = businessObjectDataService.restoreBusinessObjectData(businessObjectDataKey, EXPIRATION_IN_DAYS);
        // Validate the returned object.
        businessObjectDataServiceTestHelper.validateBusinessObjectData(businessObjectDataEntity.getId(), businessObjectDataKey, LATEST_VERSION_FLAG_SET, BDATA_STATUS, businessObjectData);
        // Validate that the origin storage unit status is RESTORING.
        assertEquals(StorageUnitStatusEntity.RESTORING, storageUnitEntity.getStatus().getCode());
        // Validate that there is still no ongoing restore request for any of the non-Glacier objects.
        for (StorageFileEntity storageFileEntity : storageUnitEntity.getStorageFiles()) {
            ObjectMetadata objectMetadata = s3Operations.getObjectMetadata(S3_BUCKET_NAME, storageFileEntity.getPath(), NO_S3_CLIENT);
            assertFalse(objectMetadata.getOngoingRestore());
        }
    } finally {
        // Delete test files from S3 storage.
        if (!s3Dao.listDirectory(glacierS3FileTransferRequestParamsDto).isEmpty()) {
            s3Dao.deleteDirectory(glacierS3FileTransferRequestParamsDto);
        }
        s3Operations.rollback();
    }
}
Also used : StorageFileEntity(org.finra.herd.model.jpa.StorageFileEntity) S3FileTransferRequestParamsDto(org.finra.herd.model.dto.S3FileTransferRequestParamsDto) StorageUnitEntity(org.finra.herd.model.jpa.StorageUnitEntity) ByteArrayInputStream(java.io.ByteArrayInputStream) BusinessObjectData(org.finra.herd.model.api.xml.BusinessObjectData) BusinessObjectDataEntity(org.finra.herd.model.jpa.BusinessObjectDataEntity) BusinessObjectDataKey(org.finra.herd.model.api.xml.BusinessObjectDataKey) ObjectMetadata(com.amazonaws.services.s3.model.ObjectMetadata) PutObjectRequest(com.amazonaws.services.s3.model.PutObjectRequest) Test(org.junit.Test)
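
The loop that stages test objects above sets the storage-class header and the ongoing-restore flag on ObjectMetadata before uploading. The sketch below shows the same staging step against a plain AWS SDK v1 AmazonS3 client instead of herd's S3Operations test wrapper; the StandardClassObjectStager class is an illustrative assumption, while Headers.STORAGE_CLASS, StorageClass.Standard, and setOngoingRestore are the SDK calls used above.

import java.io.ByteArrayInputStream;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.Headers;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;
import com.amazonaws.services.s3.model.StorageClass;

public class StandardClassObjectStager {

    /** Uploads a zero-filled object of the given size with STANDARD storage class metadata and no ongoing restore. */
    public static void putStandardClassObject(AmazonS3 s3Client, String bucketName, String key, int sizeBytes) {
        ObjectMetadata metadata = new ObjectMetadata();
        // Mark the object as STANDARD storage class so a restore test can treat it as non-Glacier.
        metadata.setHeader(Headers.STORAGE_CLASS, StorageClass.Standard);
        metadata.setOngoingRestore(false);
        metadata.setContentLength(sizeBytes);
        s3Client.putObject(new PutObjectRequest(bucketName, key, new ByteArrayInputStream(new byte[sizeBytes]), metadata));
    }
}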

Example 75 with S3FileTransferRequestParamsDto

Use of org.finra.herd.model.dto.S3FileTransferRequestParamsDto in project herd by FINRAOS.

From the class S3ServiceTest, the method testDeleteDirectoryIgnoreException:

@Test
public void testDeleteDirectoryIgnoreException() {
    // Create an S3 file transfer request parameters DTO.
    S3FileTransferRequestParamsDto s3FileTransferRequestParamsDto = new S3FileTransferRequestParamsDto();
    // Call the method under test.
    s3Service.deleteDirectoryIgnoreException(s3FileTransferRequestParamsDto);
    // Verify the external calls.
    verify(s3Dao).deleteDirectory(s3FileTransferRequestParamsDto);
    verifyNoMoreInteractions(s3Dao);
}
Also used : S3FileTransferRequestParamsDto(org.finra.herd.model.dto.S3FileTransferRequestParamsDto) Test(org.junit.Test)
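
Example 75 is a pure delegation test: the service method is expected to do nothing but forward to the DAO. The sketch below shows one way such a test could be wired with Mockito annotations; S3ServiceImpl and its package are assumptions about the implementation class under test (herd's own test base classes may wire the mocks differently), while verify and verifyNoMoreInteractions mirror the assertions above.

import org.junit.Before;
import org.junit.Test;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.MockitoAnnotations;

import org.finra.herd.dao.S3Dao;
import org.finra.herd.model.dto.S3FileTransferRequestParamsDto;
import org.finra.herd.service.impl.S3ServiceImpl;

import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.verifyNoMoreInteractions;

public class S3ServiceDelegationTest {

    @Mock
    private S3Dao s3Dao;

    @InjectMocks
    private S3ServiceImpl s3Service;

    @Before
    public void setUp() {
        MockitoAnnotations.initMocks(this);
    }

    @Test
    public void testDeleteDirectoryIgnoreExceptionDelegatesToDao() {
        S3FileTransferRequestParamsDto params = new S3FileTransferRequestParamsDto();
        // Call the method under test and verify the only external interaction is the delegation to the DAO.
        s3Service.deleteDirectoryIgnoreException(params);
        verify(s3Dao).deleteDirectory(params);
        verifyNoMoreInteractions(s3Dao);
    }
}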

Aggregations

S3FileTransferRequestParamsDto (org.finra.herd.model.dto.S3FileTransferRequestParamsDto): 160
Test (org.junit.Test): 119
File (java.io.File): 41
PutObjectRequest (com.amazonaws.services.s3.model.PutObjectRequest): 31
ByteArrayInputStream (java.io.ByteArrayInputStream): 30
BusinessObjectDataKey (org.finra.herd.model.api.xml.BusinessObjectDataKey): 27
ObjectMetadata (com.amazonaws.services.s3.model.ObjectMetadata): 23
ObjectNotFoundException (org.finra.herd.model.ObjectNotFoundException): 23
S3FileTransferResultsDto (org.finra.herd.model.dto.S3FileTransferResultsDto): 23
IOException (java.io.IOException): 21
AmazonServiceException (com.amazonaws.AmazonServiceException): 20
MultiObjectDeleteException (com.amazonaws.services.s3.model.MultiObjectDeleteException): 20
InvocationOnMock (org.mockito.invocation.InvocationOnMock): 20
AmazonClientException (com.amazonaws.AmazonClientException): 19
AmazonS3Exception (com.amazonaws.services.s3.model.AmazonS3Exception): 19
ArrayList (java.util.ArrayList): 19
S3ObjectSummary (com.amazonaws.services.s3.model.S3ObjectSummary): 18
Tag (com.amazonaws.services.s3.model.Tag): 16
StorageFile (org.finra.herd.model.api.xml.StorageFile): 15
AmazonS3Client (com.amazonaws.services.s3.AmazonS3Client): 14