
Example 1 with SingleSampleRunMetadata

Use of com.hartwig.pipeline.metadata.SingleSampleRunMetadata in project pipeline5 by hartwigmedical.

From class StagedOutputPublisher, method publish:

public void publish(final PipelineState state, final SomaticRunMetadata metadata) {
    if (state.status() != PipelineStatus.FAILED && run.isPresent()) {
        List<AddDatatype> addDatatypes = state.stageOutputs().stream().map(StageOutput::datatypes).flatMap(List::stream).collect(Collectors.toList());
        SampleSet set = setResolver.resolve(metadata.set(), useOnlyDBSets);
        Optional<String> tumorSampleName = metadata.maybeTumor().map(SingleSampleRunMetadata::sampleName);
        Optional<String> refSampleName = metadata.maybeReference().map(SingleSampleRunMetadata::sampleName);
        ImmutableAnalysis.Builder alignedReadsAnalysis = eventBuilder(Type.ALIGNMENT);
        ImmutableAnalysis.Builder somaticAnalysis = eventBuilder(Type.SOMATIC);
        ImmutableAnalysis.Builder germlineAnalysis = eventBuilder(Type.GERMLINE);
        OutputIterator.from(blob -> {
            Optional<AddDatatype> dataType = addDatatypes.stream().filter(d -> blob.getName().endsWith(d.path())).findFirst();
            Blob blobWithMd5 = sourceBucket.get(blob.getName());
            if (isSecondary(blobWithMd5)) {
                alignedReadsAnalysis.addOutput(createBlob(tumorSampleName, refSampleName, dataType, blobWithMd5));
            } else {
                if (isGermline(blobWithMd5)) {
                    germlineAnalysis.addOutput(createBlob(tumorSampleName, refSampleName, dataType, blobWithMd5));
                } else if (notSecondary(blobWithMd5)) {
                    somaticAnalysis.addOutput(createBlob(tumorSampleName, refSampleName, dataType, blobWithMd5));
                }
            }
        }, sourceBucket).iterate(metadata);
        publish(PipelineComplete.builder()
                .pipeline(ImmutablePipeline.builder()
                        .sample(tumorSampleName.orElseGet(() -> refSampleName.orElseThrow()))
                        .bucket(sourceBucket.getName())
                        .runId(run.get().getId())
                        .setId(set.getId())
                        .context(context)
                        .addAnalyses(alignedReadsAnalysis.build(), somaticAnalysis.build(), germlineAnalysis.build())
                        .version(Versions.pipelineMajorMinorVersion())
                        .build())
                .build());
    }
}
Also used : StageOutput(com.hartwig.pipeline.StageOutput) Analysis(com.hartwig.events.Analysis) ImmutableAnalysisOutputBlob(com.hartwig.events.ImmutableAnalysisOutputBlob) Arrays(java.util.Arrays) ImmutableAnalysis(com.hartwig.events.ImmutableAnalysis) MD5s(com.hartwig.pipeline.metadata.MD5s) SageConfiguration(com.hartwig.pipeline.calling.sage.SageConfiguration) Aligner(com.hartwig.pipeline.alignment.Aligner) SnpGenotype(com.hartwig.pipeline.snpgenotype.SnpGenotype) Versions(com.hartwig.pipeline.tools.Versions) OutputIterator(com.hartwig.pipeline.transfer.OutputIterator) BamMetrics(com.hartwig.pipeline.metrics.BamMetrics) Blob(com.google.cloud.storage.Blob) Type(com.hartwig.events.Analysis.Type) Publisher(com.google.cloud.pubsub.v1.Publisher) PipelineStatus(com.hartwig.pipeline.execution.PipelineStatus) Molecule(com.hartwig.events.Analysis.Molecule) Bucket(com.google.cloud.storage.Bucket) Run(com.hartwig.api.model.Run) Pipeline(com.hartwig.events.Pipeline) PipelineState(com.hartwig.pipeline.PipelineState) ObjectMapper(com.fasterxml.jackson.databind.ObjectMapper) PipelineComplete(com.hartwig.events.PipelineComplete) Collectors(java.util.stream.Collectors) SampleSet(com.hartwig.api.model.SampleSet) Flagstat(com.hartwig.pipeline.flagstat.Flagstat) List(java.util.List) AddDatatype(com.hartwig.pipeline.metadata.AddDatatype) AnalysisOutputBlob(com.hartwig.events.AnalysisOutputBlob) SomaticRunMetadata(com.hartwig.pipeline.metadata.SomaticRunMetadata) CramConversion(com.hartwig.pipeline.cram.CramConversion) SingleSampleRunMetadata(com.hartwig.pipeline.metadata.SingleSampleRunMetadata) GermlineCaller(com.hartwig.pipeline.calling.germline.GermlineCaller) Optional(java.util.Optional) ImmutablePipeline(com.hartwig.events.ImmutablePipeline) Predicate.not(java.util.function.Predicate.not) NotNull(org.jetbrains.annotations.NotNull)
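The publish method above pairs each storage blob with a datatype by checking whether the blob name ends with the datatype's path. Below is a minimal standalone sketch of that matching step; the `Datatype` record and sample names are hypothetical stand-ins, not the real pipeline5 `AddDatatype` class.

```java
import java.util.List;
import java.util.Optional;

// Standalone sketch of the suffix match in StagedOutputPublisher:
// addDatatypes.stream().filter(d -> blob.getName().endsWith(d.path())).findFirst()
public class DatatypeMatch {

    // Hypothetical stand-in for AddDatatype: a name plus the path suffix it claims.
    record Datatype(String name, String path) {}

    static Optional<Datatype> match(List<Datatype> datatypes, String blobName) {
        // First datatype whose path is a suffix of the blob name wins.
        return datatypes.stream().filter(d -> blobName.endsWith(d.path())).findFirst();
    }

    public static void main(String[] args) {
        List<Datatype> datatypes = List.of(
                new Datatype("ALIGNED_READS", "sample.bam"),
                new Datatype("SOMATIC_VARIANTS", "sample.vcf.gz"));
        System.out.println(match(datatypes, "run-1/aligner/sample.bam").map(Datatype::name).orElse("none"));  // ALIGNED_READS
        System.out.println(match(datatypes, "run-1/metrics/sample.wgsmetrics").map(Datatype::name).orElse("none"));  // none
    }
}
```

Blobs with no matching datatype still flow through: the real code passes the empty `Optional` on to `createBlob`.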

Example 2 with SingleSampleRunMetadata

Use of com.hartwig.pipeline.metadata.SingleSampleRunMetadata in project pipeline5 by hartwigmedical.

From class BwaAligner, method run:

public AlignmentOutput run(final SingleSampleRunMetadata metadata) throws Exception {
    StageTrace trace = new StageTrace(NAMESPACE, metadata.sampleName(), StageTrace.ExecutorType.COMPUTE_ENGINE).start();
    RuntimeBucket rootBucket = RuntimeBucket.from(storage, NAMESPACE, metadata, arguments, labels);
    Sample sample = sampleSource.sample(metadata);
    if (sample.bam().isPresent()) {
        String noPrefix = sample.bam().orElseThrow().replace("gs://", "");
        int firstSlash = noPrefix.indexOf("/");
        String bucket = noPrefix.substring(0, firstSlash);
        String path = noPrefix.substring(firstSlash + 1);
        return AlignmentOutput.builder().sample(metadata.sampleName()).status(PipelineStatus.PROVIDED).maybeAlignments(GoogleStorageLocation.of(bucket, path)).build();
    }
    final ResourceFiles resourceFiles = buildResourceFiles(arguments);
    sampleUpload.run(sample, rootBucket);
    List<Future<PipelineStatus>> futures = new ArrayList<>();
    List<GoogleStorageLocation> perLaneBams = new ArrayList<>();
    List<ReportComponent> laneLogComponents = new ArrayList<>();
    List<GoogleStorageLocation> laneFailedLogs = new ArrayList<>();
    for (Lane lane : sample.lanes()) {
        RuntimeBucket laneBucket = RuntimeBucket.from(storage, laneNamespace(lane), metadata, arguments, labels);
        BashStartupScript bash = BashStartupScript.of(laneBucket.name());
        InputDownload first = new InputDownload(GoogleStorageLocation.of(rootBucket.name(), fastQFileName(sample.name(), lane.firstOfPairPath())));
        InputDownload second = new InputDownload(GoogleStorageLocation.of(rootBucket.name(), fastQFileName(sample.name(), lane.secondOfPairPath())));
        bash.addCommand(first).addCommand(second);
        bash.addCommands(OverrideReferenceGenomeCommand.overrides(arguments));
        SubStageInputOutput alignment = new LaneAlignment(arguments.sbpApiRunId().isPresent(), resourceFiles.refGenomeFile(), first.getLocalTargetPath(), second.getLocalTargetPath(), metadata.sampleName(), lane).apply(SubStageInputOutput.empty(metadata.sampleName()));
        perLaneBams.add(GoogleStorageLocation.of(laneBucket.name(), resultsDirectory.path(alignment.outputFile().fileName())));
        bash.addCommands(alignment.bash()).addCommand(new OutputUpload(GoogleStorageLocation.of(laneBucket.name(), resultsDirectory.path()), RuntimeFiles.typical()));
        futures.add(executorService.submit(() -> runWithRetries(metadata, laneBucket, VirtualMachineJobDefinition.alignment(laneId(lane).toLowerCase(), bash, resultsDirectory))));
        laneLogComponents.add(new RunLogComponent(laneBucket, laneNamespace(lane), Folder.from(metadata), resultsDirectory));
        laneFailedLogs.add(GoogleStorageLocation.of(laneBucket.name(), RunLogComponent.LOG_FILE));
    }
    AlignmentOutput output;
    if (lanesSuccessfullyComplete(futures)) {
        List<InputDownload> laneBams = perLaneBams.stream().map(InputDownload::new).collect(Collectors.toList());
        BashStartupScript mergeMarkdupsBash = BashStartupScript.of(rootBucket.name());
        laneBams.forEach(mergeMarkdupsBash::addCommand);
        SubStageInputOutput merged = new MergeMarkDups(laneBams.stream().map(InputDownload::getLocalTargetPath).filter(path -> path.endsWith("bam")).collect(Collectors.toList())).apply(SubStageInputOutput.empty(metadata.sampleName()));
        mergeMarkdupsBash.addCommands(merged.bash());
        mergeMarkdupsBash.addCommand(new OutputUpload(GoogleStorageLocation.of(rootBucket.name(), resultsDirectory.path()), RuntimeFiles.typical()));
        PipelineStatus status = runWithRetries(metadata, rootBucket, VirtualMachineJobDefinition.mergeMarkdups(mergeMarkdupsBash, resultsDirectory));
        ImmutableAlignmentOutput.Builder outputBuilder = AlignmentOutput.builder()
                .sample(metadata.sampleName())
                .status(status)
                .maybeAlignments(GoogleStorageLocation.of(rootBucket.name(), resultsDirectory.path(merged.outputFile().fileName())))
                .addAllReportComponents(laneLogComponents)
                .addAllFailedLogLocations(laneFailedLogs)
                .addFailedLogLocations(GoogleStorageLocation.of(rootBucket.name(), RunLogComponent.LOG_FILE))
                .addReportComponents(new RunLogComponent(rootBucket, Aligner.NAMESPACE, Folder.from(metadata), resultsDirectory));
        if (!arguments.outputCram()) {
            outputBuilder.addReportComponents(
                    new SingleFileComponent(rootBucket, Aligner.NAMESPACE, Folder.from(metadata), bam(metadata.sampleName()), bam(metadata.sampleName()), resultsDirectory),
                    new SingleFileComponent(rootBucket, Aligner.NAMESPACE, Folder.from(metadata), bai(bam(metadata.sampleName())), bai(bam(metadata.sampleName())), resultsDirectory))
                    .addDatatypes(
                            new AddDatatype(DataType.ALIGNED_READS, metadata.barcode(), new ArchivePath(Folder.from(metadata), BwaAligner.NAMESPACE, bam(metadata.sampleName()))),
                            new AddDatatype(DataType.ALIGNED_READS_INDEX, metadata.barcode(), new ArchivePath(Folder.from(metadata), BwaAligner.NAMESPACE, bai(metadata.sampleName()))));
        }
        output = outputBuilder.build();
    } else {
        output = AlignmentOutput.builder().sample(metadata.sampleName()).status(PipelineStatus.FAILED).build();
    }
    trace.stop();
    executorService.shutdown();
    return output;
}
Also used : Arguments(com.hartwig.pipeline.Arguments) StageTrace(com.hartwig.pipeline.trace.StageTrace) SubStageInputOutput(com.hartwig.pipeline.stages.SubStageInputOutput) Aligner(com.hartwig.pipeline.alignment.Aligner) InputDownload(com.hartwig.pipeline.execution.vm.InputDownload) ArrayList(java.util.ArrayList) VirtualMachineJobDefinition(com.hartwig.pipeline.execution.vm.VirtualMachineJobDefinition) Future(java.util.concurrent.Future) RuntimeBucket(com.hartwig.pipeline.storage.RuntimeBucket) PipelineStatus(com.hartwig.pipeline.execution.PipelineStatus) ExecutorService(java.util.concurrent.ExecutorService) BashStartupScript(com.hartwig.pipeline.execution.vm.BashStartupScript) DataType(com.hartwig.pipeline.datatypes.DataType) FileTypes.bai(com.hartwig.pipeline.datatypes.FileTypes.bai) ImmutableAlignmentOutput(com.hartwig.pipeline.alignment.ImmutableAlignmentOutput) GoogleStorageLocation(com.hartwig.pipeline.storage.GoogleStorageLocation) Lane(com.hartwig.patient.Lane) ArchivePath(com.hartwig.pipeline.metadata.ArchivePath) SampleUpload(com.hartwig.pipeline.storage.SampleUpload) Folder(com.hartwig.pipeline.report.Folder) ResultsDirectory(com.hartwig.pipeline.ResultsDirectory) OutputUpload(com.hartwig.pipeline.execution.vm.OutputUpload) DefaultBackoffPolicy(com.hartwig.pipeline.failsafe.DefaultBackoffPolicy) Collectors(java.util.stream.Collectors) String.format(java.lang.String.format) File(java.io.File) SingleFileComponent(com.hartwig.pipeline.report.SingleFileComponent) Failsafe(net.jodah.failsafe.Failsafe) ResourceFilesFactory.buildResourceFiles(com.hartwig.pipeline.resource.ResourceFilesFactory.buildResourceFiles) ExecutionException(java.util.concurrent.ExecutionException) List(java.util.List) Sample(com.hartwig.patient.Sample) AddDatatype(com.hartwig.pipeline.metadata.AddDatatype) AlignmentOutput(com.hartwig.pipeline.alignment.AlignmentOutput) OverrideReferenceGenomeCommand(com.hartwig.pipeline.resource.OverrideReferenceGenomeCommand) RuntimeFiles(com.hartwig.pipeline.execution.vm.RuntimeFiles) SingleSampleRunMetadata(com.hartwig.pipeline.metadata.SingleSampleRunMetadata) Storage(com.google.cloud.storage.Storage) Labels(com.hartwig.pipeline.labels.Labels) FileTypes.bam(com.hartwig.pipeline.datatypes.FileTypes.bam) ResourceFiles(com.hartwig.pipeline.resource.ResourceFiles) ComputeEngine(com.hartwig.pipeline.execution.vm.ComputeEngine) ReportComponent(com.hartwig.pipeline.report.ReportComponent) RunLogComponent(com.hartwig.pipeline.report.RunLogComponent) SampleSource(com.hartwig.pipeline.alignment.sample.SampleSource)
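When a BAM is already provided, BwaAligner splits its `gs://` URL into bucket and path: strip the scheme, take everything up to the first slash as the bucket, the remainder as the object path. A minimal standalone sketch of just that split (the `GsUrl` class and sample URL are illustrative, not part of pipeline5):

```java
// Standalone sketch of the gs:// URL split used in BwaAligner.run when sample.bam() is present.
public class GsUrl {

    static String bucket(String gsUrl) {
        String noPrefix = gsUrl.replace("gs://", "");
        // Bucket name is everything before the first slash.
        return noPrefix.substring(0, noPrefix.indexOf("/"));
    }

    static String path(String gsUrl) {
        String noPrefix = gsUrl.replace("gs://", "");
        // Object path is everything after the first slash.
        return noPrefix.substring(noPrefix.indexOf("/") + 1);
    }

    public static void main(String[] args) {
        String url = "gs://my-bucket/samples/SAMPLE1/SAMPLE1.bam";
        System.out.println(bucket(url)); // my-bucket
        System.out.println(path(url));   // samples/SAMPLE1/SAMPLE1.bam
    }
}
```

The result feeds straight into `GoogleStorageLocation.of(bucket, path)` with status `PROVIDED`, skipping alignment entirely.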

Example 3 with SingleSampleRunMetadata

Use of com.hartwig.pipeline.metadata.SingleSampleRunMetadata in project pipeline5 by hartwigmedical.

From class PipelineMain, method start:

public PipelineState start(final Arguments arguments) {
    LOGGER.info("Arguments are [{}]", arguments);
    Versions.printAll();
    try {
        GoogleCredentials credentials = CredentialProvider.from(arguments).get();
        Storage storage = StorageProvider.from(arguments, credentials).get();
        Publisher turquoisePublisher = PublisherProvider.from(arguments, credentials).get("turquoise.events");
        Publisher pipelinePublisher = PublisherProvider.from(arguments, credentials).get(PipelineComplete.TOPIC);
        SomaticMetadataApi somaticMetadataApi = SomaticMetadataApiProvider.from(arguments, storage, pipelinePublisher).get();
        SingleSampleEventListener referenceEventListener = new SingleSampleEventListener();
        SingleSampleEventListener tumorEventListener = new SingleSampleEventListener();
        SomaticRunMetadata somaticRunMetadata = somaticMetadataApi.get();
        InputMode mode = new ModeResolver().apply(somaticRunMetadata);
        LOGGER.info("Starting pipeline in [{}] mode", mode);
        String ini = somaticRunMetadata.isSingleSample() ? "single_sample" : arguments.shallow() ? "shallow" : "somatic";
        PipelineProperties eventSubjects = PipelineProperties.builder()
                .sample(somaticRunMetadata.maybeTumor().map(SingleSampleRunMetadata::sampleName).orElseGet(() -> somaticRunMetadata.reference().sampleName()))
                .runId(arguments.sbpApiRunId())
                .set(somaticRunMetadata.set())
                .referenceBarcode(somaticRunMetadata.maybeReference().map(SingleSampleRunMetadata::barcode))
                .tumorBarcode(somaticRunMetadata.maybeTumor().map(SingleSampleRunMetadata::barcode))
                .type(ini)
                .build();
        somaticMetadataApi.start();
        startedEvent(eventSubjects, turquoisePublisher, arguments.publishToTurquoise());
        BlockingQueue<BamMetricsOutput> referenceBamMetricsOutputQueue = new ArrayBlockingQueue<>(1);
        BlockingQueue<BamMetricsOutput> tumorBamMetricsOutputQueue = new ArrayBlockingQueue<>(1);
        BlockingQueue<FlagstatOutput> referenceFlagstatOutputQueue = new ArrayBlockingQueue<>(1);
        BlockingQueue<FlagstatOutput> tumorFlagstatOutputQueue = new ArrayBlockingQueue<>(1);
        BlockingQueue<GermlineCallerOutput> germlineCallerOutputQueue = new ArrayBlockingQueue<>(1);
        StartingPoint startingPoint = new StartingPoint(arguments);
        PersistedDataset persistedDataset = arguments.biopsy().<PersistedDataset>map(b -> new ApiPersistedDataset(SbpRestApi.newInstance(arguments.sbpApiUrl()), ObjectMappers.get(), b, arguments.project())).orElse(new NoopPersistedDataset());
        PipelineState state = new FullPipeline(
                singleSamplePipeline(arguments, credentials, storage, referenceEventListener, somaticRunMetadata, referenceBamMetricsOutputQueue, germlineCallerOutputQueue, referenceFlagstatOutputQueue, startingPoint, persistedDataset, mode),
                singleSamplePipeline(arguments, credentials, storage, tumorEventListener, somaticRunMetadata, tumorBamMetricsOutputQueue, germlineCallerOutputQueue, tumorFlagstatOutputQueue, startingPoint, persistedDataset, mode),
                somaticPipeline(arguments, credentials, storage, somaticRunMetadata, referenceBamMetricsOutputQueue, tumorBamMetricsOutputQueue, referenceFlagstatOutputQueue, tumorFlagstatOutputQueue, startingPoint, persistedDataset, mode),
                Executors.newCachedThreadPool(), referenceEventListener, tumorEventListener, somaticMetadataApi,
                CleanupProvider.from(arguments, storage).get()).run();
        completedEvent(eventSubjects, turquoisePublisher, state.status().toString(), arguments.publishToTurquoise());
        VmExecutionLogSummary.ofFailedStages(storage, state);
        return state;
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
Also used : InputMode(com.hartwig.pipeline.metadata.InputMode) PublisherProvider(com.hartwig.pipeline.pubsub.PublisherProvider) PipelineProperties(com.hartwig.pipeline.turquoise.PipelineProperties) StorageProvider(com.hartwig.pipeline.storage.StorageProvider) SbpRestApi(com.hartwig.pipeline.sbpapi.SbpRestApi) LoggerFactory(org.slf4j.LoggerFactory) SomaticMetadataApiProvider(com.hartwig.pipeline.metadata.SomaticMetadataApiProvider) PipelineResultsProvider(com.hartwig.pipeline.report.PipelineResultsProvider) Versions(com.hartwig.pipeline.tools.Versions) PipelineCompleted(com.hartwig.pipeline.turquoise.PipelineCompleted) ApiPersistedDataset(com.hartwig.pipeline.reruns.ApiPersistedDataset) Publisher(com.google.cloud.pubsub.v1.Publisher) PipelineStatus(com.hartwig.pipeline.execution.PipelineStatus) CleanupProvider(com.hartwig.pipeline.cleanup.CleanupProvider) ModeResolver(com.hartwig.pipeline.metadata.ModeResolver) PersistedDataset(com.hartwig.pipeline.reruns.PersistedDataset) TurquoiseEvent(com.hartwig.pipeline.turquoise.TurquoiseEvent) SomaticMetadataApi(com.hartwig.pipeline.metadata.SomaticMetadataApi) Logger(org.slf4j.Logger) GoogleCredentials(com.google.auth.oauth2.GoogleCredentials) StageRunner(com.hartwig.pipeline.stages.StageRunner) StartingPoint(com.hartwig.pipeline.reruns.StartingPoint) FlagstatOutput(com.hartwig.pipeline.flagstat.FlagstatOutput) PipelineComplete(com.hartwig.events.PipelineComplete) GoogleComputeEngine(com.hartwig.pipeline.execution.vm.GoogleComputeEngine) BlockingQueue(java.util.concurrent.BlockingQueue) NoopPersistedDataset(com.hartwig.pipeline.reruns.NoopPersistedDataset) AlignerProvider(com.hartwig.pipeline.alignment.AlignerProvider) SingleSampleEventListener(com.hartwig.pipeline.metadata.SingleSampleEventListener) Executors(java.util.concurrent.Executors) PipelineStarted(com.hartwig.pipeline.turquoise.PipelineStarted) ArrayBlockingQueue(java.util.concurrent.ArrayBlockingQueue) GermlineCallerOutput(com.hartwig.pipeline.calling.germline.GermlineCallerOutput) BamMetricsOutput(com.hartwig.pipeline.metrics.BamMetricsOutput) SomaticRunMetadata(com.hartwig.pipeline.metadata.SomaticRunMetadata) VmExecutionLogSummary(com.hartwig.pipeline.report.VmExecutionLogSummary) ObjectMappers(com.hartwig.pipeline.jackson.ObjectMappers) ParseException(org.apache.commons.cli.ParseException) SingleSampleRunMetadata(com.hartwig.pipeline.metadata.SingleSampleRunMetadata) Storage(com.google.cloud.storage.Storage) CredentialProvider(com.hartwig.pipeline.credentials.CredentialProvider) Labels(com.hartwig.pipeline.labels.Labels)
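PipelineMain wires the single-sample and somatic pipelines together with capacity-1 `ArrayBlockingQueue`s: each producer stage puts exactly one output, and the somatic pipeline blocks until it arrives. A minimal standalone sketch of that handoff, using a plain `String` in place of the pipeline5 output types:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Standalone sketch of the capacity-1 BlockingQueue handoff in PipelineMain.start.
public class QueueHandoff {

    public static void main(String[] args) throws Exception {
        // Capacity 1: each stage produces exactly one output for its consumer.
        BlockingQueue<String> metricsQueue = new ArrayBlockingQueue<>(1);
        ExecutorService executor = Executors.newCachedThreadPool();

        // Producer: stands in for the single-sample pipeline finishing a stage.
        executor.submit(() -> {
            metricsQueue.put("reference-metrics");
            return null;
        });

        // Consumer: stands in for the somatic pipeline, which blocks until the output arrives.
        String received = metricsQueue.take();
        System.out.println(received); // reference-metrics

        executor.shutdown();
    }
}
```

The capacity of 1 doubles as backpressure: a second `put` on the same queue would block, so a stage cannot silently publish twice.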

Example 4 with SingleSampleRunMetadata

Use of com.hartwig.pipeline.metadata.SingleSampleRunMetadata in project pipeline5 by hartwigmedical.

From class TestInputs, method defaultSomaticRunMetadata:

public static SomaticRunMetadata defaultSomaticRunMetadata() {
    final SingleSampleRunMetadata tumor = tumorRunMetadata();
    final SingleSampleRunMetadata reference = referenceRunMetadata();
    return SomaticRunMetadata.builder().set(SET).maybeTumor(tumor).maybeReference(reference).bucket(BUCKET).build();
}
Also used : SingleSampleRunMetadata(com.hartwig.pipeline.metadata.SingleSampleRunMetadata)
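The `SomaticRunMetadata.builder()` used above is generated by the Immutables annotation processor. As a rough illustration of the `maybeTumor`/`maybeReference` shape, here is a simplified hand-written builder; the `RunMetadata` class and its fields are stand-ins, not the generated pipeline5 code:

```java
import java.util.Optional;

// Simplified, hand-written stand-in for the Immutables-generated builder in TestInputs.
public class RunMetadata {

    final String set;
    final Optional<String> tumor;
    final Optional<String> reference;

    RunMetadata(String set, Optional<String> tumor, Optional<String> reference) {
        this.set = set;
        this.tumor = tumor;
        this.reference = reference;
    }

    static Builder builder() { return new Builder(); }

    static class Builder {
        private String set;
        private Optional<String> tumor = Optional.empty();
        private Optional<String> reference = Optional.empty();

        Builder set(String set) { this.set = set; return this; }
        // "maybe" setters wrap the value; leaving them unset models a single-sample run.
        Builder maybeTumor(String tumor) { this.tumor = Optional.of(tumor); return this; }
        Builder maybeReference(String reference) { this.reference = Optional.of(reference); return this; }
        RunMetadata build() { return new RunMetadata(set, tumor, reference); }
    }

    public static void main(String[] args) {
        RunMetadata metadata = RunMetadata.builder().set("test-set").maybeTumor("TUMOR1").maybeReference("REF1").build();
        System.out.println(metadata.set + " tumor=" + metadata.tumor.orElse("-")); // test-set tumor=TUMOR1
    }
}
```

Omitting `maybeTumor` (or `maybeReference`) leaves the corresponding `Optional` empty, which is how the real metadata distinguishes single-sample from tumor/reference runs.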

Aggregations

SingleSampleRunMetadata (com.hartwig.pipeline.metadata.SingleSampleRunMetadata) 4
PipelineStatus (com.hartwig.pipeline.execution.PipelineStatus) 3
Publisher (com.google.cloud.pubsub.v1.Publisher) 2
Storage (com.google.cloud.storage.Storage) 2
PipelineComplete (com.hartwig.events.PipelineComplete) 2
Aligner (com.hartwig.pipeline.alignment.Aligner) 2
AddDatatype (com.hartwig.pipeline.metadata.AddDatatype) 2
List (java.util.List) 2
Collectors (java.util.stream.Collectors) 2
ObjectMapper (com.fasterxml.jackson.databind.ObjectMapper) 1
GoogleCredentials (com.google.auth.oauth2.GoogleCredentials) 1
Blob (com.google.cloud.storage.Blob) 1
Bucket (com.google.cloud.storage.Bucket) 1
Run (com.hartwig.api.model.Run) 1
SampleSet (com.hartwig.api.model.SampleSet) 1
Analysis (com.hartwig.events.Analysis) 1
Molecule (com.hartwig.events.Analysis.Molecule) 1
Type (com.hartwig.events.Analysis.Type) 1
AnalysisOutputBlob (com.hartwig.events.AnalysisOutputBlob) 1
ImmutableAnalysis (com.hartwig.events.ImmutableAnalysis) 1