Example 16 with Filter

Use of com.amazonaws.services.s3.model.Filter in project esop by instaclustr.

The class BaseS3Restorer, method listBucket:

private List<S3ObjectSummary> listBucket(final String remotePrefix, final Predicate<String> keyFilter) {
    ObjectListing objectListing = amazonS3.listObjects(request.storageLocation.bucket, remotePrefix);
    boolean hasMoreContent = true;
    final List<S3ObjectSummary> summaryList = new ArrayList<>();
    while (hasMoreContent) {
        objectListing.getObjectSummaries().stream()
            // skip directory placeholder keys
            .filter(objectSummary -> !objectSummary.getKey().endsWith("/"))
            .filter(file -> keyFilter.test(file.getKey()))
            .forEach(summaryList::add);
        if (objectListing.isTruncated()) {
            objectListing = amazonS3.listNextBatchOfObjects(objectListing);
        } else {
            hasMoreContent = false;
        }
    }
    return summaryList;
}
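
A minimal usage sketch for the method above. The prefix, the key filter, and the log output are hypothetical; in esop the prefix would come from the restore request, and listBucket is private, so assume the call sits in a sibling method of BaseS3Restorer:

// Hypothetical call site inside BaseS3Restorer: collect manifest objects under a prefix.
// The "manifests/" prefix and the ".json" suffix test are illustrative only.
final List<S3ObjectSummary> manifests = listBucket("manifests/", key -> key.endsWith(".json"));
for (final S3ObjectSummary summary : manifests) {
    System.out.printf("found %s (%d bytes)%n", summary.getKey(), summary.getSize());
}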

Example 17 with Filter

Use of com.amazonaws.services.s3.model.Filter in project solarnetwork-central by SolarNetwork.

The class S3DatumExportDestinationServiceTests, method settingSpecifiers:

@Test
public void settingSpecifiers() {
    // given
    S3DatumExportDestinationService service = new S3DatumExportDestinationService();
    // when
    List<SettingSpecifier> specs = service.getSettingSpecifiers();
    // then
    assertThat("Setting specs provided", specs, hasSize(5));
    Set<String> keys = specs.stream()
            .filter(s -> s instanceof KeyedSettingSpecifier<?>)
            .map(s -> ((KeyedSettingSpecifier<?>) s).getKey())
            .collect(Collectors.toSet());
    assertThat("Setting keys", keys, containsInAnyOrder("accessKey", "secretKey", "path", "filenameTemplate", "storageClass"));
}
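
The filter-then-cast stream idiom in that test is a common way to pull typed elements out of a heterogeneous list. A standalone sketch of the same pattern; this generic helper is hypothetical, not part of SolarNetwork, and needs java.util.function.Function:

// Hypothetical generic helper using the same idiom as the test above:
// keep only elements of a given subtype, then map each to a projection.
static <T, R> Set<R> keysOfType(final List<?> items, final Class<T> type, final Function<? super T, R> projection) {
    return items.stream()
            .filter(type::isInstance)
            .map(type::cast)
            .map(projection)
            .collect(Collectors.toSet());
}

// e.g. keysOfType(specs, KeyedSettingSpecifier.class, KeyedSettingSpecifier::getKey)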

Example 18 with Filter

Use of com.amazonaws.services.s3.model.Filter in project trellis-extensions by trellis-ldp.

The class S3MementoService, method put:

@Override
public CompletionStage<Void> put(final Resource resource) {
    return runAsync(() -> {
        try {
            final File file = createTempFile("trellis-memento-", ".nq");
            file.deleteOnExit();
            final Map<String, String> metadata = new HashMap<>();
            metadata.put(S3Resource.INTERACTION_MODEL, resource.getInteractionModel().getIRIString());
            metadata.put(S3Resource.MODIFIED, resource.getModified().toString());
            resource.getContainer().map(IRI::getIRIString).ifPresent(c -> metadata.put(S3Resource.CONTAINER, c));
            resource.getBinaryMetadata().ifPresent(b -> {
                metadata.put(S3Resource.BINARY_LOCATION, b.getIdentifier().getIRIString());
                b.getMimeType().ifPresent(m -> metadata.put(S3Resource.BINARY_TYPE, m));
            });
            resource.getMembershipResource().map(IRI::getIRIString).ifPresent(m -> metadata.put(S3Resource.MEMBERSHIP_RESOURCE, m));
            resource.getMemberRelation().map(IRI::getIRIString).ifPresent(m -> metadata.put(S3Resource.MEMBER_RELATION, m));
            resource.getMemberOfRelation().map(IRI::getIRIString).ifPresent(m -> metadata.put(S3Resource.MEMBER_OF_RELATION, m));
            resource.getInsertedContentRelation().map(IRI::getIRIString).ifPresent(m -> metadata.put(S3Resource.INSERTED_CONTENT_RELATION, m));
            try (final Dataset dataset = rdf.createDataset();
                final OutputStream output = Files.newOutputStream(file.toPath());
                final Stream<? extends Quad> quads = resource.stream()) {
                quads.forEachOrdered(dataset::add);
                metadata.put(S3Resource.METADATA_GRAPHS, dataset.getGraphNames().filter(IRI.class::isInstance).map(IRI.class::cast).filter(graph -> !IGNORE.contains(graph)).map(IRI::getIRIString).collect(joining(",")));
                RDFDataMgr.write(output, toJena(dataset), NQUADS);
            }
            // upload the serialized memento with its metadata, then remove the local buffer
            final ObjectMetadata md = new ObjectMetadata();
            md.setContentType("application/n-quads");
            md.setUserMetadata(metadata);
            final PutObjectRequest req = new PutObjectRequest(bucketName, getKey(resource.getIdentifier(), resource.getModified().truncatedTo(SECONDS)), file);
            client.putObject(req.withMetadata(md));
            Files.delete(file.toPath());
        } catch (final Exception ex) {
            throw new TrellisRuntimeException("Error persisting memento to S3", ex);
        }
    });
}
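
Since put returns a CompletionStage, a caller that needs the memento persisted before continuing has to wait on it. A minimal sketch; the mementoService and resource instances are assumed to be provided by the surrounding application:

// Hypothetical caller: persist a memento and block until the async upload completes.
// 'mementoService' (an injected S3MementoService) and 'resource' are assumed to exist.
mementoService.put(resource)
        .toCompletableFuture()
        .join(); // a failure surfaces as a CompletionException wrapping TrellisRuntimeException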

Example 19 with Filter

Use of com.amazonaws.services.s3.model.Filter in project nrtsearch by Yelp.

The class BackupRestoreIndexRequestHandlerTest, method getFiles:

public List<String> getFiles(Path basePath) {
    List<String> result = new ArrayList<>();
    ImmutableList<File> childFiles = FileUtils.listFiles(basePath.toFile());
    for (File childFile : childFiles) {
        if (Files.isDirectory(childFile.toPath())) {
            result.addAll(getFiles(childFile.toPath()));
        } else if (Files.isRegularFile(childFile.toPath())) {
            result.add(childFile.getName());
        }
    }
    return result.stream()
            .filter(x -> !x.startsWith("snapshots") && !x.startsWith("stateRefCounts"))
            .collect(Collectors.toList());
}
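
A quick usage sketch; the path is hypothetical, where the test would point at the index data directory:

// Hypothetical call: collect the names of all regular files under a directory tree,
// with the snapshots/stateRefCounts bookkeeping entries filtered out.
final List<String> fileNames = getFiles(Paths.get("/tmp/nrtsearch-index"));
fileNames.forEach(System.out::println);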

Example 20 with Filter

Use of com.amazonaws.services.s3.model.Filter in project datapull by homeaway.

The class DataPullRequestProcessor, method runDataPull:

private void runDataPull(String json, boolean isStart, boolean validateJson) throws ProcessingException {
    String originalInputJson = json;
    json = extractUserJsonFromS3IfProvided(json, isStart);
    final EMRProperties emrProperties = this.config.getEmrProperties();
    if (log.isDebugEnabled())
        log.debug("runDataPull -> json = " + json + " isStart = " + isStart);
    try {
        if (validateJson) {
            json = validateAndEnrich(json);
        }
        log.info("Running datapull for json : " + json + " cron expression = " + isStart + "env =" + env);
        final ObjectNode node = new ObjectMapper().readValue(json, ObjectNode.class);
        List<Map.Entry<String, JsonNode>> result = new LinkedList<Map.Entry<String, JsonNode>>();
        Iterator<Map.Entry<String, JsonNode>> nodes = node.fields();
        while (nodes.hasNext()) {
            result.add(nodes.next());
        }
        JsonNode clusterNode = result.stream().filter(y -> y.getKey().equalsIgnoreCase("cluster")).map(x -> x.getValue()).findAny().get();
        JsonNode migrationsNode = result.stream().filter(y -> y.getKey().equalsIgnoreCase("migrations")).map(x -> x.getValue()).findAny().get();
        if (clusterNode == null)
            throw new ProcessingException("Invalid Json!!! Cluster properties cannot be null");
        String creator = node.has(CREATOR) ? node.findValue(CREATOR).asText() : "";
        ObjectMapper mapper = new ObjectMapper();
        ClusterProperties reader = mapper.treeToValue(clusterNode, ClusterProperties.class);
        Migration[] myObjects = mapper.treeToValue(migrationsNode, Migration[].class);
        String cronExp = Objects.toString(reader.getCronExpression(), "");
        if (!cronExp.isEmpty())
            cronExp = validateAndProcessCronExpression(cronExp);
        String pipeline = Objects.toString(reader.getPipelineName(), UUID.randomUUID().toString());
        String pipelineEnv = Objects.toString(reader.getAwsEnv(), env);
        DataPullProperties dataPullProperties = config.getDataPullProperties();
        String applicationHistoryFolder = dataPullProperties.getApplicationHistoryFolder();
        String s3RepositoryBucketName = dataPullProperties.getS3BucketName();
        String jobName = pipelineEnv + PIPELINE_NAME_DELIMITER + EMR + PIPELINE_NAME_DELIMITER + pipeline + PIPELINE_NAME_DELIMITER + PIPELINE_NAME_SUFFIX;
        String applicationHistoryFolderPath = applicationHistoryFolder == null || applicationHistoryFolder.isEmpty() ? s3RepositoryBucketName + "/" + DATAPULL_HISTORY_FOLDER : applicationHistoryFolder;
        String bootstrapFilePath = s3RepositoryBucketName + "/" + BOOTSTRAP_FOLDER;
        String filePath = applicationHistoryFolderPath + "/" + jobName;
        String bootstrapFile = jobName + ".sh";
        String jksFilePath = bootstrapFilePath + "/" + bootstrapFile;
        String bootstrapActionStringFromUser = Objects.toString(reader.getBootstrapactionstring(), "");
        String defaultBootstrapString = emrProperties.getDefaultBootstrapString();
        Boolean haveBootstrapAction = createBootstrapScript(myObjects, bootstrapFile, bootstrapFilePath, bootstrapActionStringFromUser, defaultBootstrapString);
        DataPullTask task = createDataPullTask(filePath, jksFilePath, reader, jobName, creator, node.path("sparkjarfile").asText(), haveBootstrapAction);
        if (!isStart) {
            json = originalInputJson.equals(json) ? json : originalInputJson;
            saveConfig(applicationHistoryFolderPath, jobName + ".json", json);
        }
        if (!isStart && tasksMap.containsKey(jobName))
            cancelExistingTask(jobName);
        if (!(isStart && cronExp.isEmpty())) {
            Future<?> future = !cronExp.isEmpty() ? scheduler.schedule(task, new CronTrigger(cronExp)) : scheduler.schedule(task, new Date(System.currentTimeMillis() + 1 * 1000));
            tasksMap.put(jobName, future);
        }
    } catch (IOException e) {
        throw new ProcessingException("exception while starting datapull " + e.getLocalizedMessage());
    }
    if (log.isDebugEnabled())
        log.debug("runDataPull <- return");
}
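
From the parsing above, the request JSON must carry top-level cluster and migrations entries and may carry creator and sparkjarfile. A minimal illustrative payload as a Java text block; every value is hypothetical, and the property names inside cluster assume Jackson's default bean naming for ClusterProperties, which is an assumption here:

// Hypothetical minimal input for runDataPull (Java 15+ text block).
// "cluster" maps to ClusterProperties and "migrations" to Migration[];
// the real schema is enforced by validateAndEnrich().
String json = """
    {
      "creator": "jane.doe@example.com",
      "sparkjarfile": "",
      "cluster": {
        "pipelineName": "orders-nightly",
        "awsEnv": "dev",
        "cronExpression": "0 0 2 * * ?"
      },
      "migrations": []
    }
    """;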
