
Example 56 with FINAL

Use of org.hl7.fhir.r4.model.Observation.ObservationStatus.FINAL in project geoprism-registry by terraframe.

The writeToFile method of the class FhirExportSynchronizationManager.

public File writeToFile() throws IOException {
    final FhirExternalSystem system = (FhirExternalSystem) this.config.getSystem();
    try (FhirConnection connection = FhirConnectionFactory.get(system)) {
        // Create a unique working directory under the vault root.
        String name = SessionPredicate.generateId();
        File root = new File(new File(VaultProperties.getPath("vault.default"), "files"), name);
        root.mkdirs();
        Bundle bundle = this.generateBundle(connection);
        FhirContext ctx = FhirContext.forR4();
        IParser parser = ctx.newJsonParser();
        // Write the bundle as JSON; try-with-resources ensures the writer is closed.
        try (FileWriter writer = new FileWriter(new File(root, "bundle.json"))) {
            parser.encodeResourceToWriter(bundle, writer);
        } catch (DataFormatException | IOException e) {
            throw new ProgrammingErrorException(e);
        }
        return root;
    } catch (Exception e) {
        throw new HttpError(e);
    }
}
Also used : FhirContext(ca.uhn.fhir.context.FhirContext) Bundle(org.hl7.fhir.r4.model.Bundle) FileWriter(java.io.FileWriter) IOException(java.io.IOException) FhirExternalSystem(net.geoprism.registry.graph.FhirExternalSystem) ProgrammingErrorException(com.runwaysdk.dataaccess.ProgrammingErrorException) DataFormatException(ca.uhn.fhir.parser.DataFormatException) HttpError(net.geoprism.registry.etl.export.HttpError) File(java.io.File) IParser(ca.uhn.fhir.parser.IParser)
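
A minimal standalone sketch of the same serialization step, independent of geoprism-registry: it builds a small R4 Bundle containing an Observation with status FINAL and writes it to JSON with HAPI FHIR's parser, using try-with-resources so the FileWriter is always closed. The class name and output path are invented for illustration.

import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.parser.IParser;
import org.hl7.fhir.r4.model.Bundle;
import org.hl7.fhir.r4.model.Observation;

public class BundleWriterSketch {

    public static void main(String[] args) throws IOException {
        // Build a trivial R4 bundle containing a single Observation with status FINAL.
        Observation observation = new Observation();
        observation.setStatus(Observation.ObservationStatus.FINAL);

        Bundle bundle = new Bundle();
        bundle.setType(Bundle.BundleType.COLLECTION);
        bundle.addEntry().setResource(observation);

        // FhirContext is expensive to create; real code would reuse a single instance.
        FhirContext ctx = FhirContext.forR4();
        IParser parser = ctx.newJsonParser().setPrettyPrint(true);

        // Hypothetical output location; try-with-resources closes and flushes the writer.
        File target = new File("bundle.json");
        try (FileWriter writer = new FileWriter(target)) {
            parser.encodeResourceToWriter(bundle, writer);
        }
    }
}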

Example 57 with FINAL

Use of org.hl7.fhir.r4.model.Observation.ObservationStatus.FINAL in project pathling by aehrc.

The invoke method of the class ReverseResolveFunction.

@Nonnull
@Override
public FhirPath invoke(@Nonnull final NamedFunctionInput input) {
    checkUserInput(input.getInput() instanceof ResourcePath, "Input to " + NAME + " function must be a resource: " + input.getInput().getExpression());
    final ResourcePath inputPath = (ResourcePath) input.getInput();
    final String expression = NamedFunction.expressionFromInput(input, NAME);
    checkUserInput(input.getArguments().size() == 1, "reverseResolve function accepts a single argument: " + expression);
    final FhirPath argument = input.getArguments().get(0);
    checkUserInput(argument instanceof ReferencePath, "Argument to reverseResolve function must be a Reference: " + argument.getExpression());
    final ReferencePath referencePath = (ReferencePath) argument;
    // Check that the input type is one of the possible types specified by the argument.
    final Set<ResourceType> argumentTypes = referencePath.getResourceTypes();
    final ResourceType inputType = inputPath.getResourceType();
    checkUserInput(argumentTypes.contains(inputType), "Reference in argument to reverseResolve does not support input resource type: " + expression);
    // Do a left outer join from the input to the argument dataset using the reference field in the
    // argument. The argument dataset sits on the left of the join, so this is expressed as a right outer join.
    final Column joinCondition = referencePath.getResourceEquality(inputPath);
    final Dataset<Row> dataset = join(referencePath.getDataset(), inputPath.getDataset(), joinCondition, JoinType.RIGHT_OUTER);
    // Check the argument for information about the current resource that it originated from - if it
    // is not present, reverse reference resolution will not be possible.
    final NonLiteralPath nonLiteralArgument = (NonLiteralPath) argument;
    checkUserInput(nonLiteralArgument.getCurrentResource().isPresent(), "Argument to reverseResolve must be an element that is navigable from a " + "target resource type: " + expression);
    final ResourcePath currentResource = nonLiteralArgument.getCurrentResource().get();
    final Optional<Column> thisColumn = inputPath.getThisColumn();
    // TODO: Consider removing in the future once we separate ordering from element ID.
    // Create a synthetic element ID column for reverse-resolved resources.
    final Column currentResourceValue = currentResource.getValueColumn();
    final WindowSpec windowSpec = Window.partitionBy(inputPath.getIdColumn(), inputPath.getOrderingColumn()).orderBy(currentResourceValue);
    // row_number() is 1-based, and we use 0-based indexes, hence the minus(1).
    final Column currentResourceIndex = when(currentResourceValue.isNull(), lit(null)).otherwise(row_number().over(windowSpec).minus(lit(1)));
    // We need to add the synthetic EID column to the parser context so that it can be used within
    // joins in certain situations, e.g. extract.
    final Column syntheticEid = inputPath.expandEid(currentResourceIndex);
    final DatasetWithColumn datasetWithEid = QueryHelpers.createColumn(dataset, syntheticEid);
    input.getContext().getNodeIdColumns().putIfAbsent(expression, datasetWithEid.getColumn());
    final ResourcePath result = currentResource.copy(expression, datasetWithEid.getDataset(), inputPath.getIdColumn(), Optional.of(syntheticEid), currentResource.getValueColumn(), false, thisColumn);
    result.setCurrentResource(currentResource);
    return result;
}
Also used : FhirPath(au.csiro.pathling.fhirpath.FhirPath) ResourceType(org.hl7.fhir.r4.model.Enumerations.ResourceType) ResourcePath(au.csiro.pathling.fhirpath.ResourcePath) DatasetWithColumn(au.csiro.pathling.QueryHelpers.DatasetWithColumn) Column(org.apache.spark.sql.Column) ReferencePath(au.csiro.pathling.fhirpath.element.ReferencePath) Row(org.apache.spark.sql.Row) NonLiteralPath(au.csiro.pathling.fhirpath.NonLiteralPath) WindowSpec(org.apache.spark.sql.expressions.WindowSpec) Nonnull(javax.annotation.Nonnull)
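
In FHIRPath terms, reverseResolve is invoked with a Reference argument, for example reverseResolve(Condition.subject) evaluated against Patient resources. The interesting detail in the method above is the synthetic element index: reverse-resolved resources are numbered per input row with row_number() over a window and shifted to a 0-based index. Below is a minimal, self-contained Spark sketch of that pattern, independent of Pathling; the column names (subject_id, resource_id) and the sample rows are invented.

import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.lit;
import static org.apache.spark.sql.functions.row_number;
import static org.apache.spark.sql.functions.when;

import java.util.Arrays;

import org.apache.spark.sql.Column;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.expressions.Window;
import org.apache.spark.sql.expressions.WindowSpec;
import org.apache.spark.sql.types.StructType;

public class RowIndexSketch {

    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("row-index-sketch")
                .master("local[*]")
                .getOrCreate();

        // Hypothetical join result: one row per (input subject, reverse-resolved resource).
        StructType schema = new StructType()
                .add("subject_id", "string")
                .add("resource_id", "string");
        Dataset<Row> joined = spark.createDataFrame(Arrays.asList(
                RowFactory.create("patient-1", "condition-a"),
                RowFactory.create("patient-1", "condition-b"),
                RowFactory.create("patient-2", null)), schema);

        // Number the resolved resources within each subject, then subtract one because
        // row_number() is 1-based while element indices are 0-based. Rows with no match
        // keep a null index.
        WindowSpec windowSpec = Window.partitionBy(col("subject_id")).orderBy(col("resource_id"));
        Column index = when(col("resource_id").isNull(), lit(null))
                .otherwise(row_number().over(windowSpec).minus(lit(1)));

        joined.withColumn("element_index", index).show();
        spark.stop();
    }
}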

Example 58 with FINAL

Use of org.hl7.fhir.r4.model.Observation.ObservationStatus.FINAL in project pathling by aehrc.

The invoke method of the class TranslateFunction.

@Nonnull
@Override
public FhirPath invoke(@Nonnull final NamedFunctionInput input) {
    validateInput(input);
    final ElementPath inputPath = (ElementPath) input.getInput();
    final ParserContext inputContext = input.getContext();
    final Column idColumn = inputPath.getIdColumn();
    final Column conceptColumn = inputPath.getValueColumn();
    final boolean isCodeableConcept = isCodeableConcept(inputPath);
    final Column codingArrayCol = isCodeableConcept ? conceptColumn.getField("coding") : when(conceptColumn.isNotNull(), array(conceptColumn)).otherwise(lit(null));
    // The definition of the result is always the Coding element.
    @SuppressWarnings("OptionalGetWithoutIsPresent") final ElementDefinition resultDefinition = isCodeableConcept ? inputPath.getChildElement("coding").get() : inputPath.getDefinition().get();
    // Prepare the data which will be used within the map operation. All of these things must be
    // Serializable.
    @SuppressWarnings("OptionalGetWithoutIsPresent") final TerminologyServiceFactory terminologyServiceFactory = inputContext.getTerminologyServiceFactory().get();
    final Arguments arguments = Arguments.of(input);
    final String conceptMapUrl = arguments.getValue(0, String.class);
    final boolean reverse = arguments.getValueOr(1, DEFAULT_REVERSE);
    final String equivalence = arguments.getValueOr(2, DEFAULT_EQUIVALENCE);
    final Dataset<Row> dataset = inputPath.getDataset();
    final MapperWithPreview<List<SimpleCoding>, Row[], ConceptTranslator> mapper = new TranslateMapperWithPreview(MDC.get("requestId"), terminologyServiceFactory, conceptMapUrl, reverse, Strings.parseCsvList(equivalence, wrapInUserInputError(ConceptMapEquivalence::fromCode)));
    final Dataset<Row> translatedDataset = SqlExtensions.mapWithPartitionPreview(dataset, codingArrayCol, SimpleCodingsDecoders::decodeList, mapper, StructField.apply("result", DataTypes.createArrayType(CodingEncoding.DATA_TYPE), true, Metadata.empty()));
    // The result is an array of translations for each input element, which we now
    // need to explode in the same way as for path traversal, creating unique element IDs.
    final MutablePair<Column, Column> valueAndEidColumns = new MutablePair<>();
    final Dataset<Row> resultDataset = inputPath.explodeArray(translatedDataset, translatedDataset.col("result"), valueAndEidColumns);
    // Construct a new result expression.
    final String expression = expressionFromInput(input, NAME);
    return ElementPath.build(expression, resultDataset, idColumn, Optional.of(valueAndEidColumns.getRight()), valueAndEidColumns.getLeft(), false, inputPath.getCurrentResource(), inputPath.getThisColumn(), resultDefinition);
}
Also used : TerminologyServiceFactory(au.csiro.pathling.fhir.TerminologyServiceFactory) MutablePair(org.apache.commons.lang3.tuple.MutablePair) ElementPath(au.csiro.pathling.fhirpath.element.ElementPath) Column(org.apache.spark.sql.Column) ConceptTranslator(au.csiro.pathling.terminology.ConceptTranslator) SimpleCodingsDecoders(au.csiro.pathling.fhirpath.encoding.SimpleCodingsDecoders) List(java.util.List) ConceptMapEquivalence(org.hl7.fhir.r4.model.Enumerations.ConceptMapEquivalence) ElementDefinition(au.csiro.pathling.fhirpath.element.ElementDefinition) Row(org.apache.spark.sql.Row) ParserContext(au.csiro.pathling.fhirpath.parser.ParserContext) Nonnull(javax.annotation.Nonnull)
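
One argument worth isolating is the equivalence list, which the method above receives as a comma-separated string and converts to ConceptMapEquivalence values via Pathling's Strings.parseCsvList and wrapInUserInputError helpers. A plain-Java sketch of that conversion, without those helpers, might look like the following; the sample input string is invented.

import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

import org.hl7.fhir.r4.model.Enumerations.ConceptMapEquivalence;

public class EquivalenceArgumentSketch {

    public static void main(String[] args) {
        // The translate function accepts its equivalences argument as a CSV string.
        String equivalence = "equivalent,wider,subsumes";

        // fromCode maps each FHIR code to the corresponding enum constant.
        List<ConceptMapEquivalence> equivalences = Arrays.stream(equivalence.split(","))
                .map(String::trim)
                .map(ConceptMapEquivalence::fromCode)
                .collect(Collectors.toList());

        equivalences.forEach(e -> System.out.println(e.toCode()));
    }
}

fromCode rejects codes that are not valid equivalences, which is why the original wraps the call with wrapInUserInputError.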

Example 59 with FINAL

Use of org.hl7.fhir.r4.model.Observation.ObservationStatus.FINAL in project pathling by aehrc.

The getInstance method of the class ElementPath.

@Nonnull
private static ElementPath getInstance(@Nonnull final String expression, @Nonnull final Dataset<Row> dataset, @Nonnull final Column idColumn, @Nonnull final Optional<Column> eidColumn, @Nonnull final Column valueColumn, final boolean singular, @Nonnull final Optional<ResourcePath> currentResource, @Nonnull final Optional<Column> thisColumn, @Nonnull final FHIRDefinedType fhirType) {
    // Look up the class that represents an element with the specified FHIR type.
    final Class<? extends ElementPath> elementPathClass = ElementDefinition.elementClassForType(fhirType).orElse(ElementPath.class);
    final DatasetWithColumnMap datasetWithColumns = eidColumn.map(eidCol -> createColumns(dataset, eidCol, valueColumn)).orElseGet(() -> createColumns(dataset, valueColumn));
    try {
        // Call its constructor and return.
        final Constructor<? extends ElementPath> constructor = elementPathClass.getDeclaredConstructor(String.class, Dataset.class, Column.class, Optional.class, Column.class, boolean.class, Optional.class, Optional.class, FHIRDefinedType.class);
        return constructor.newInstance(expression, datasetWithColumns.getDataset(), idColumn, eidColumn.map(datasetWithColumns::getColumn), datasetWithColumns.getColumn(valueColumn), singular, currentResource, thisColumn, fhirType);
    } catch (final NoSuchMethodException | InstantiationException | IllegalAccessException | InvocationTargetException e) {
        throw new RuntimeException("Problem building an ElementPath class", e);
    }
}
Also used : Getter(lombok.Getter) Dataset(org.apache.spark.sql.Dataset) NonLiteralPath(au.csiro.pathling.fhirpath.NonLiteralPath) FHIRDefinedType(org.hl7.fhir.r4.model.Enumerations.FHIRDefinedType) Column(org.apache.spark.sql.Column) QueryHelpers.createColumns(au.csiro.pathling.QueryHelpers.createColumns) Row(org.apache.spark.sql.Row) Constructor(java.lang.reflect.Constructor) ResourcePath(au.csiro.pathling.fhirpath.ResourcePath) InvocationTargetException(java.lang.reflect.InvocationTargetException) AccessLevel(lombok.AccessLevel) DatasetWithColumnMap(au.csiro.pathling.QueryHelpers.DatasetWithColumnMap) FhirPath(au.csiro.pathling.fhirpath.FhirPath) Optional(java.util.Optional) InvalidUserInputError(au.csiro.pathling.errors.InvalidUserInputError) Nonnull(javax.annotation.Nonnull)
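
To see the reflective dispatch above in isolation: choose a subclass at runtime, look up a constructor with a known parameter list, invoke it, and wrap the checked reflection exceptions in a RuntimeException. The sketch below follows that shape with hypothetical Shape and Circle classes standing in for ElementPath and its subclasses.

import java.lang.reflect.Constructor;
import java.lang.reflect.InvocationTargetException;

public class ReflectiveFactorySketch {

    // Hypothetical stand-ins for ElementPath and its typed subclasses.
    static class Shape {
        final String label;
        Shape(final String label) { this.label = label; }
    }

    static class Circle extends Shape {
        Circle(final String label) { super(label); }
    }

    static Shape build(final Class<? extends Shape> shapeClass, final String label) {
        try {
            // Look up a constructor with the expected parameter list and invoke it,
            // mirroring how getInstance selects the ElementPath subclass at runtime.
            final Constructor<? extends Shape> constructor =
                    shapeClass.getDeclaredConstructor(String.class);
            return constructor.newInstance(label);
        } catch (final NoSuchMethodException | InstantiationException
                | IllegalAccessException | InvocationTargetException e) {
            throw new RuntimeException("Problem building a " + shapeClass.getSimpleName(), e);
        }
    }

    public static void main(String[] args) {
        Shape shape = build(Circle.class, "unit circle");
        System.out.println(shape.getClass().getSimpleName() + ": " + shape.label);
    }
}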

Example 60 with FINAL

Use of org.hl7.fhir.r4.model.Observation.ObservationStatus.FINAL in project pathling by aehrc.

The populateScope method of the class ManifestConverter.

void populateScope(@Nonnull final PassportScope passportScope, @Nonnull final VisaManifest manifest) {
    // Create a filter for the Patient resource.
    final String patientIdCollection = manifest.getPatientIds().stream().map(id -> "'" + id + "'").collect(Collectors.joining(" combine "));
    final String patientIdFilter = "identifier.where(system = '" + StringLiteralPath.escapeFhirPathString(patientIdSystem) + "').where(value in (" + patientIdCollection + "))" + ".empty().not()";
    final Set<String> patientFilters = passportScope.get(ResourceType.PATIENT);
    if (patientFilters == null) {
        passportScope.put(ResourceType.PATIENT, new HashSet<>(List.of(patientIdFilter)));
    } else {
        patientFilters.add(patientIdFilter);
    }
    // See: https://www.hl7.org/fhir/r4/compartmentdefinition-patient.html
    for (final ResourceType resourceType : ResourceType.values()) {
        if (resourceType.equals(ResourceType.DOMAINRESOURCE) || resourceType.equals(ResourceType.RESOURCE) || resourceType.equals(ResourceType.NULL)) {
            continue;
        }
        final RuntimeResourceDefinition definition = fhirContext.getResourceDefinition(resourceType.toCode());
        final List<RuntimeSearchParam> searchParams = definition.getSearchParamsForCompartmentName("Patient");
        for (final RuntimeSearchParam searchParam : searchParams) {
            final String path = searchParam.getPath();
            // Remove the leading "[resource type]." from the path.
            final String pathTrimmed = path.replaceFirst("^" + resourceType.toCode() + "\\.", "");
            // Paths that end with this resolve pattern are polymorphic references, and will need
            // to be resolved using `ofType()` within our implementation.
            final String resolvePattern = ".where(resolve() is Patient)";
            final String filter;
            if (pathTrimmed.endsWith(resolvePattern)) {
                filter = pathTrimmed.replace(resolvePattern, ".resolve().ofType(Patient)." + patientIdFilter);
            } else {
                final Set<String> targets = searchParam.getTargets();
                if (targets.size() > 0 && !targets.contains("Patient")) {
                    // If the search parameter has targets but none of them is Patient, we need to skip it altogether.
                    continue;
                } else if (targets.size() == 1) {
                    // If the search parameter is monomorphic, we can resolve it without `ofType`.
                    filter = pathTrimmed + ".resolve()." + patientIdFilter;
                } else {
                    // If the search parameter is polymorphic, we also need to resolve it to Patient. Note
                    // that polymorphic references with an "Any" type have zero targets.
                    filter = pathTrimmed + ".resolve().ofType(Patient)." + patientIdFilter;
                }
            }
            // Add the filter to the map.
            final Set<String> filters = passportScope.get(resourceType);
            if (filters == null) {
                passportScope.put(resourceType, new HashSet<>(List.of(filter)));
            } else {
                filters.add(filter);
            }
        }
    }
}
Also used : RuntimeSearchParam(ca.uhn.fhir.context.RuntimeSearchParam) Configuration(au.csiro.pathling.Configuration) Set(java.util.Set) ResourceType(org.hl7.fhir.r4.model.Enumerations.ResourceType) Collectors(java.util.stream.Collectors) Profile(org.springframework.context.annotation.Profile) HashSet(java.util.HashSet) FhirContext(ca.uhn.fhir.context.FhirContext) List(java.util.List) Component(org.springframework.stereotype.Component) StringLiteralPath(au.csiro.pathling.fhirpath.literal.StringLiteralPath) RuntimeResourceDefinition(ca.uhn.fhir.context.RuntimeResourceDefinition) Nonnull(javax.annotation.Nonnull)
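
The loop above uses HAPI FHIR's runtime model to find, for each resource type, the search parameters that place a resource in the Patient compartment. A minimal sketch of that lookup for a single, arbitrarily chosen resource type is shown below; it only prints the parameter names and paths rather than building FHIRPath filters.

import java.util.List;

import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.context.RuntimeResourceDefinition;
import ca.uhn.fhir.context.RuntimeSearchParam;

public class PatientCompartmentSketch {

    public static void main(String[] args) {
        FhirContext fhirContext = FhirContext.forR4();

        // Look up the Observation definition and the search parameters that link it
        // to the Patient compartment (e.g. subject, performer).
        RuntimeResourceDefinition definition = fhirContext.getResourceDefinition("Observation");
        List<RuntimeSearchParam> searchParams =
                definition.getSearchParamsForCompartmentName("Patient");

        for (RuntimeSearchParam searchParam : searchParams) {
            // getPath returns the FHIRPath-style expression, e.g. "Observation.subject".
            System.out.println(searchParam.getName() + " -> " + searchParam.getPath());
        }
    }
}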

Aggregations

Test (org.junit.jupiter.api.Test) 229
SpringBootTest (org.springframework.boot.test.context.SpringBootTest) 85
HashMap (java.util.HashMap) 83
CamelSpringBootTest (org.apache.camel.test.spring.junit5.CamelSpringBootTest) 59
List (java.util.List) 53
Bundle (org.hl7.fhir.dstu3.model.Bundle) 50
Nonnull (javax.annotation.Nonnull) 48
Patient (org.hl7.fhir.dstu3.model.Patient) 46
Organization (org.hl7.fhir.dstu3.model.Organization) 45
ArrayList (java.util.ArrayList) 44
Bundle (org.hl7.fhir.r4.model.Bundle) 41
IBaseResource (org.hl7.fhir.instance.model.api.IBaseResource) 39
UUID (java.util.UUID) 38
Collectors (java.util.stream.Collectors) 38
Coding (org.hl7.fhir.r4.model.Coding) 34
FhirContext (ca.uhn.fhir.context.FhirContext) 33
IGenericClient (ca.uhn.fhir.rest.client.api.IGenericClient) 32
IParser (ca.uhn.fhir.parser.IParser) 31
IOException (java.io.IOException) 29
IdType (org.hl7.fhir.dstu3.model.IdType) 28