Example 1 with UploadSchema

use of org.sagebionetworks.bridge.models.upload.UploadSchema in project BridgeServer2 by Sage-Bionetworks.

the class UploadSchemaService method createUploadSchemaFromSurvey.

/**
 * <p>
 * Creates an upload schema from a survey. This is generally called when a survey is published, to
 * create the corresponding upload schema, so that health data records can be created from survey responses.
 * This method will also persist the schema to the backing store.
 * </p>
 * <p>
 * If newSchemaRev is true, this method will always create a new schema revision. If false, it will attempt to
 * modify the existing schema revision. However, if the schema revisions are not compatible, it will fall back to
 * creating a new schema revision.
 * </p>
 */
public UploadSchema createUploadSchemaFromSurvey(String appId, Survey survey, boolean newSchemaRev) {
    // https://sagebionetworks.jira.com/browse/BRIDGE-1698 - If the existing Schema ID points to a different survey
    // or a non-survey, this is an error. Having multiple surveys point to the same schema ID causes really bad
    // things to happen, and we need to prevent it.
    String schemaId = survey.getIdentifier();
    UploadSchema oldSchema = getUploadSchemaNoThrow(appId, schemaId);
    if (oldSchema != null) {
        if (oldSchema.getSchemaType() != UploadSchemaType.IOS_SURVEY
                || !Objects.equals(oldSchema.getSurveyGuid(), survey.getGuid())) {
            throw new BadRequestException("Survey with identifier " + schemaId +
                    " conflicts with schema with the same ID. Please use a different survey identifier.");
        }
    }
    // Was there an old schema? If so, and we aren't forcing a new revision, we can potentially update it
    // in place for the same survey.
    if (!newSchemaRev && oldSchema != null) {
        // Check that the old schema already has the answers field.
        List<UploadFieldDefinition> oldFieldDefList = oldSchema.getFieldDefinitions();
        UploadFieldDefinition answersFieldDef = getElement(oldFieldDefList, UploadFieldDefinition::getName, FIELD_ANSWERS).orElse(null);
        if (answersFieldDef == null) {
            // Old schema doesn't have the answers field. Add it and update the schema in place.
            List<UploadFieldDefinition> newFieldDefList = new ArrayList<>(oldFieldDefList);
            newFieldDefList.add(UploadUtil.ANSWERS_FIELD_DEF);
            addSurveySchemaMetadata(oldSchema, survey);
            oldSchema.setFieldDefinitions(newFieldDefList);
            return updateSchemaRevisionV4(appId, schemaId, oldSchema.getRevision(), oldSchema);
        }
        // Answers field needs to be either
        // (a) an attachment (Large Text or normal)
        // (b) a string with isUnboundedText=true
        UploadFieldType fieldType = answersFieldDef.getType();
        if (fieldType == UploadFieldType.LARGE_TEXT_ATTACHMENT
                || UploadFieldType.ATTACHMENT_TYPE_SET.contains(fieldType)
                || (UploadFieldType.STRING_TYPE_SET.contains(fieldType)
                        && Boolean.TRUE.equals(answersFieldDef.isUnboundedText()))) {
            // The old schema works for the new survey. However, we want to ensure the old schema points to the
            // latest version of the survey. Update survey metadata in the schema.
            addSurveySchemaMetadata(oldSchema, survey);
            return updateSchemaRevisionV4(appId, schemaId, oldSchema.getRevision(), oldSchema);
        }
    // If execution gets this far, that means we have a schema with an "answers" field that's not compatible.
    // At this point, we go into the branch that creates a new schema, below.
    }
    // We were unable to reconcile this with the existing schema. Create a new schema. (Create API will
    // automatically bump the rev number if an old schema revision exists.)
    UploadSchema schemaToCreate = UploadSchema.create();
    addSurveySchemaMetadata(schemaToCreate, survey);
    schemaToCreate.setFieldDefinitions(ImmutableList.of(UploadUtil.ANSWERS_FIELD_DEF));
    return createSchemaRevisionV4(appId, schemaToCreate);
}
Also used : UploadFieldDefinition(org.sagebionetworks.bridge.models.upload.UploadFieldDefinition) UploadFieldType(org.sagebionetworks.bridge.models.upload.UploadFieldType) ArrayList(java.util.ArrayList) BadRequestException(org.sagebionetworks.bridge.exceptions.BadRequestException) UploadSchema(org.sagebionetworks.bridge.models.upload.UploadSchema)
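
As a usage illustration, here is a minimal sketch of how a caller might invoke this method when a survey is published. The class and method names are hypothetical and the service is assumed to be injected; only createUploadSchemaFromSurvey itself comes from the example above.

import org.sagebionetworks.bridge.models.surveys.Survey;
import org.sagebionetworks.bridge.models.upload.UploadSchema;
import org.sagebionetworks.bridge.services.UploadSchemaService;

// Hypothetical caller class, not part of BridgeServer2.
public class SurveyPublicationListener {
    private final UploadSchemaService schemaService;

    public SurveyPublicationListener(UploadSchemaService schemaService) {
        this.schemaService = schemaService;
    }

    public UploadSchema onSurveyPublished(String appId, Survey survey) {
        // newSchemaRev=false asks the service to update the existing schema revision in place when the
        // revisions are compatible; per the Javadoc, it falls back to a new revision when they are not.
        return schemaService.createUploadSchemaFromSurvey(appId, survey, false);
    }
}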

Example 2 with UploadSchema

use of org.sagebionetworks.bridge.models.upload.UploadSchema in project BridgeServer2 by Sage-Bionetworks.

the class UploadSchemaService method deleteUploadSchemaByIdAndRevision.

/**
 * Service handler for deleting an upload schema with the specified app, schema ID, and revision. If the schema
 * doesn't exist, this API throws an EntityNotFoundException.
 */
public void deleteUploadSchemaByIdAndRevision(String appId, String schemaId, int rev) {
    // Schema ID and rev are validated by getRevisionForDeletion().
    UploadSchema schema = getRevisionForDeletion(appId, schemaId, rev);
    if (schema == null || schema.isDeleted()) {
        throw new EntityNotFoundException(UploadSchema.class);
    }
    uploadSchemaDao.deleteUploadSchemas(ImmutableList.of(schema));
}
Also used : UploadSchema(org.sagebionetworks.bridge.models.upload.UploadSchema) EntityNotFoundException(org.sagebionetworks.bridge.exceptions.EntityNotFoundException)
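
A hedged sketch of calling this from cleanup code, treating the documented EntityNotFoundException as a no-op. SchemaCleanup and deleteQuietly are illustrative names, not Bridge code.

import org.sagebionetworks.bridge.exceptions.EntityNotFoundException;
import org.sagebionetworks.bridge.services.UploadSchemaService;

// Hypothetical helper, not part of BridgeServer2.
public class SchemaCleanup {
    private final UploadSchemaService schemaService;

    public SchemaCleanup(UploadSchemaService schemaService) {
        this.schemaService = schemaService;
    }

    /** Returns true if the revision was deleted, false if it didn't exist or was already deleted. */
    public boolean deleteQuietly(String appId, String schemaId, int rev) {
        try {
            schemaService.deleteUploadSchemaByIdAndRevision(appId, schemaId, rev);
            return true;
        } catch (EntityNotFoundException e) {
            // Thrown when the revision is missing or already marked deleted (see above).
            return false;
        }
    }
}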

Example 3 with UploadSchema

use of org.sagebionetworks.bridge.models.upload.UploadSchema in project BridgeServer2 by Sage-Bionetworks.

the class UploadSchemaService method getUploadSchemasForApp.

/**
 * Service handler for fetching the most recent revision of all upload schemas in an app.
 */
public List<UploadSchema> getUploadSchemasForApp(String appId, boolean includeDeleted) {
    // Get all revisions of all schemas; there is no simple query for just the latest revision of each.
    List<UploadSchema> allSchemasAllRevisions = getAllUploadSchemasAllRevisions(appId, includeDeleted);
    // Iterate and keep the most recent revision for each unique schema ID.
    Map<String, UploadSchema> schemaMap = new HashMap<>();
    for (UploadSchema schema : allSchemasAllRevisions) {
        UploadSchema existing = schemaMap.get(schema.getSchemaId());
        if (existing == null || schema.getRevision() > existing.getRevision()) {
            schemaMap.put(schema.getSchemaId(), schema);
        }
    }
    // Return order is unspecified; callers can sort the result if they need a particular order.
    return ImmutableList.copyOf(schemaMap.values());
}
Also used : HashMap(java.util.HashMap) UploadSchema(org.sagebionetworks.bridge.models.upload.UploadSchema)
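
The loop above is a classic keep-the-max-per-key reduction. For comparison, a stream-based sketch of the same logic (illustrative only, not the project's code):

import java.util.Collection;
import java.util.List;
import java.util.Map;
import java.util.function.BinaryOperator;
import java.util.stream.Collectors;
import org.sagebionetworks.bridge.models.upload.UploadSchema;

// Hypothetical helper, not part of BridgeServer2.
public class LatestRevisions {
    public static Collection<UploadSchema> latestPerSchemaId(List<UploadSchema> allSchemasAllRevisions) {
        // The merge function keeps whichever schema has the higher revision, like the loop above.
        BinaryOperator<UploadSchema> higherRevision = (a, b) -> a.getRevision() >= b.getRevision() ? a : b;
        Map<String, UploadSchema> latestById = allSchemasAllRevisions.stream()
                .collect(Collectors.toMap(UploadSchema::getSchemaId, schema -> schema, higherRevision));
        return latestById.values();
    }
}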

Example 4 with UploadSchema

use of org.sagebionetworks.bridge.models.upload.UploadSchema in project BridgeServer2 by Sage-Bionetworks.

the class UploadHandlersEndToEndTest method testNonSurvey.

private void testNonSurvey(String infoJsonText) throws Exception {
    // set up schema
    List<UploadFieldDefinition> fieldDefList = ImmutableList.of(
            new UploadFieldDefinition.Builder().withName("CCC.txt").withType(UploadFieldType.ATTACHMENT_BLOB).build(),
            new UploadFieldDefinition.Builder().withName("DDD.csv").withType(UploadFieldType.ATTACHMENT_CSV).build(),
            new UploadFieldDefinition.Builder().withName("EEE.json").withType(UploadFieldType.ATTACHMENT_JSON_BLOB).build(),
            new UploadFieldDefinition.Builder().withName("FFF.json").withType(UploadFieldType.ATTACHMENT_JSON_TABLE).build(),
            new UploadFieldDefinition.Builder().withName("GGG.txt").withType(UploadFieldType.ATTACHMENT_V2).build(),
            new UploadFieldDefinition.Builder().withName("record.json.HHH").withType(UploadFieldType.ATTACHMENT_V2).build(),
            new UploadFieldDefinition.Builder().withName("record.json.III").withType(UploadFieldType.BOOLEAN).build(),
            new UploadFieldDefinition.Builder().withName("record.json.JJJ").withType(UploadFieldType.CALENDAR_DATE).build(),
            new UploadFieldDefinition.Builder().withName("record.json.LLL").withType(UploadFieldType.DURATION_V2).build(),
            new UploadFieldDefinition.Builder().withName("record.json.MMM").withType(UploadFieldType.FLOAT).build(),
            new UploadFieldDefinition.Builder().withName("record.json.NNN").withType(UploadFieldType.INLINE_JSON_BLOB).build(),
            new UploadFieldDefinition.Builder().withName("record.json.OOO").withType(UploadFieldType.INT).build(),
            new UploadFieldDefinition.Builder().withName("record.json.PPP").withType(UploadFieldType.STRING).build(),
            new UploadFieldDefinition.Builder().withName("record.json.QQQ").withType(UploadFieldType.TIME_V2).build(),
            new UploadFieldDefinition.Builder().withName("record.json.arrr").withType(UploadFieldType.TIMESTAMP).build(),
            new UploadFieldDefinition.Builder().withName("empty_attachment").withType(UploadFieldType.ATTACHMENT_V2).withRequired(false).build());
    UploadSchema schema = UploadSchema.create();
    schema.setFieldDefinitions(fieldDefList);
    schema.setName(SCHEMA_NAME);
    schema.setRevision(SCHEMA_REV);
    schema.setSchemaId(SCHEMA_ID);
    schema.setSchemaType(UploadSchemaType.IOS_DATA);
    schema.setAppId(TEST_APP_ID);
    // set up upload files
    String cccTxtContent = "Blob file";
    String dddCsvContent = "foo,bar\nbaz,qux";
    String eeeJsonContent = "{\"key\":\"value\"}";
    String fffJsonContent = "[{\"name\":\"Dwayne\"},{\"name\":\"Eggplant\"}]";
    String gggTxtContent = "Arbitrary attachment";
    // Note that a lot of these have the wrong type, but are convertible to the correct type. This is to test that
    // values can be canonicalized.
    String recordJsonContent = "{\n" +
            "   \"HHH\":[\"attachment\", \"inside\", \"file\"],\n" +
            "   \"III\":1,\n" +
            "   \"JJJ\":\"2016-06-03T17:03-0700\",\n" +
            "   \"LLL\":\"PT1H\",\n" +
            "   \"MMM\":\"3.14\",\n" +
            "   \"NNN\":[\"inline\", \"json\"],\n" +
            "   \"OOO\":\"2.718\",\n" +
            "   \"PPP\":1337,\n" +
            "   \"QQQ\":\"2016-06-03T19:21:35.378-0700\",\n" +
            "   \"arrr\":\"2016-06-03T18:12:34.567+0900\"\n" +
            "}";
    Map<String, String> fileMap = ImmutableMap.<String, String>builder()
            .put("info.json", infoJsonText)
            .put("CCC.txt", cccTxtContent)
            .put("DDD.csv", dddCsvContent)
            .put("EEE.json", eeeJsonContent)
            .put("FFF.json", fffJsonContent)
            .put("GGG.txt", gggTxtContent)
            .put("record.json", recordJsonContent)
            .put("empty_attachment", "")
            .build();
    // execute
    test(schema, null, fileMap, null);
    // verify created record
    ArgumentCaptor<HealthDataRecord> recordCaptor = ArgumentCaptor.forClass(HealthDataRecord.class);
    verify(mockHealthDataService).createOrUpdateRecord(recordCaptor.capture());
    HealthDataRecord record = recordCaptor.getValue();
    validateCommonRecordProps(record);
    assertEquals(record.getSchemaId(), SCHEMA_ID);
    assertEquals(record.getSchemaRevision().intValue(), SCHEMA_REV);
    JsonNode dataNode = record.getData();
    assertEquals(dataNode.size(), 15);
    assertTrue(dataNode.get("record.json.III").booleanValue());
    assertEquals(dataNode.get("record.json.JJJ").textValue(), "2016-06-03");
    assertEquals(dataNode.get("record.json.LLL").textValue(), "PT1H");
    assertEquals(dataNode.get("record.json.MMM").doubleValue(), 3.14, /*delta*/
    0.001);
    assertEquals(dataNode.get("record.json.OOO").intValue(), 2);
    assertEquals(dataNode.get("record.json.PPP").textValue(), "1337");
    assertEquals(dataNode.get("record.json.QQQ").textValue(), "19:21:35.378");
    assertEquals(DateTime.parse(dataNode.get("record.json.arrr").textValue()), DateTime.parse("2016-06-03T18:12:34.567+0900"));
    JsonNode nnnNode = dataNode.get("record.json.NNN");
    assertEquals(nnnNode.size(), 2);
    assertEquals(nnnNode.get(0).textValue(), "inline");
    assertEquals(nnnNode.get(1).textValue(), "json");
    // validate attachment content in S3
    String cccTxtAttachmentId = dataNode.get("CCC.txt").textValue();
    validateTextAttachment(cccTxtContent, cccTxtAttachmentId);
    String dddCsvAttachmentId = dataNode.get("DDD.csv").textValue();
    validateTextAttachment(dddCsvContent, dddCsvAttachmentId);
    String gggTxtAttachmentId = dataNode.get("GGG.txt").textValue();
    validateTextAttachment(gggTxtContent, gggTxtAttachmentId);
    String eeeJsonAttachmentId = dataNode.get("EEE.json").textValue();
    byte[] eeeJsonUploadedContent = uploadedFileContentMap.get(eeeJsonAttachmentId);
    JsonNode eeeJsonNode = BridgeObjectMapper.get().readTree(eeeJsonUploadedContent);
    assertEquals(eeeJsonNode.size(), 1);
    assertEquals(eeeJsonNode.get("key").textValue(), "value");
    String fffJsonAttachmentId = dataNode.get("FFF.json").textValue();
    byte[] fffJsonUploadedContent = uploadedFileContentMap.get(fffJsonAttachmentId);
    JsonNode fffJsonNode = BridgeObjectMapper.get().readTree(fffJsonUploadedContent);
    assertEquals(fffJsonNode.size(), 2);
    assertEquals(fffJsonNode.get(0).size(), 1);
    assertEquals(fffJsonNode.get(0).get("name").textValue(), "Dwayne");
    assertEquals(fffJsonNode.get(1).size(), 1);
    assertEquals(fffJsonNode.get(1).get("name").textValue(), "Eggplant");
    String hhhAttachmentId = dataNode.get("record.json.HHH").textValue();
    ArgumentCaptor<byte[]> attachmentContentCaptor = ArgumentCaptor.forClass(byte[].class);
    verify(mockS3UploadHelper).writeBytesToS3(eq(TestConstants.ATTACHMENT_BUCKET), eq(hhhAttachmentId),
            attachmentContentCaptor.capture(), metadataCaptor.capture());
    JsonNode hhhNode = BridgeObjectMapper.get().readTree(attachmentContentCaptor.getValue());
    assertEquals(hhhNode.size(), 3);
    assertEquals(hhhNode.get(0).textValue(), "attachment");
    assertEquals(hhhNode.get(1).textValue(), "inside");
    assertEquals(hhhNode.get(2).textValue(), "file");
    assertEquals(metadataCaptor.getValue().getSSEAlgorithm(), ObjectMetadata.AES_256_SERVER_SIDE_ENCRYPTION);
    // We upload the unencrypted zipped file back to S3.
    validateRawDataAttachment(RAW_ZIP_FILENAME);
    // verify upload dao write validation status
    verify(mockUploadDao).writeValidationStatus(upload, UploadStatus.SUCCEEDED, ImmutableList.of(), UPLOAD_ID);
}
Also used : UploadFieldDefinition(org.sagebionetworks.bridge.models.upload.UploadFieldDefinition) HealthDataRecord(org.sagebionetworks.bridge.models.healthdata.HealthDataRecord) JsonNode(com.fasterxml.jackson.databind.JsonNode) UploadSchema(org.sagebionetworks.bridge.models.upload.UploadSchema)
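
The canonicalization the test exercises (numeric strings accepted for FLOAT, fractional values truncated for INT, numbers stringified for STRING) might look roughly like the sketch below. This is illustrative only, not Bridge's actual canonicalization code; the class and method names are hypothetical.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.DoubleNode;
import com.fasterxml.jackson.databind.node.IntNode;
import com.fasterxml.jackson.databind.node.TextNode;
import org.sagebionetworks.bridge.models.upload.UploadFieldType;

// Hypothetical sketch, not part of BridgeServer2.
public class CanonicalizationSketch {
    public static JsonNode canonicalize(UploadFieldType type, JsonNode value) {
        switch (type) {
            case FLOAT:
                // Numeric strings are accepted, hence "3.14" -> 3.14 in the test above.
                return value.isTextual() ? DoubleNode.valueOf(Double.parseDouble(value.textValue())) : value;
            case INT: {
                // Fractional values (including numeric strings) are truncated, hence "2.718" -> 2.
                double d = value.isTextual() ? Double.parseDouble(value.textValue()) : value.doubleValue();
                return IntNode.valueOf((int) d);
            }
            case STRING:
                // Non-string values are stringified, hence 1337 -> "1337".
                return value.isTextual() ? value : TextNode.valueOf(value.asText());
            default:
                return value;
        }
    }
}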

Example 5 with UploadSchema

use of org.sagebionetworks.bridge.models.upload.UploadSchema in project BridgeServer2 by Sage-Bionetworks.

the class HealthDataService method submitHealthData.

/* HEALTH DATA SUBMISSION */
/**
 * Synchronous health data API. Used to submit small health data payloads (such as survey responses) without
 * incurring the overhead of creating a bunch of small files to upload to S3.
 */
public HealthDataRecord submitHealthData(String appId, StudyParticipant participant,
        HealthDataSubmission healthDataSubmission) throws IOException, UploadValidationException {
    // validate health data submission
    if (healthDataSubmission == null) {
        throw new InvalidEntityException("Health data submission cannot be null");
    }
    Validate.entityThrowingException(HealthDataSubmissionValidator.INSTANCE, healthDataSubmission);
    // Generate a new uploadId.
    String uploadId = BridgeUtils.generateGuid();
    // construct health data record
    HealthDataRecord record = makeRecordFromSubmission(appId, participant, healthDataSubmission);
    // get schema
    UploadSchema schema = getSchemaForSubmission(appId, healthDataSubmission);
    if (schema != null) {
        // sanitize field names in the data node
        JsonNode sanitizedData = sanitizeFieldNames(healthDataSubmission.getData());
        // Filter data fields and attachments based on schema fields.
        filterAttachments(uploadId, schema, sanitizedData, record);
    }
    // Construct UploadValidationContext for the remaining upload handlers. We don't need all the fields, just the
    // ones that these handlers will be using.
    UploadValidationContext uploadValidationContext = new UploadValidationContext();
    uploadValidationContext.setHealthCode(participant.getHealthCode());
    uploadValidationContext.setHealthDataRecord(record);
    uploadValidationContext.setAppId(appId);
    // For back-compat reasons, we need to make a dummy upload to store the uploadId. This will never be persisted.
    // We just need a way to signal the Upload Validation pipeline to use this uploadId.
    Upload upload = Upload.create();
    upload.setUploadId(uploadId);
    uploadValidationContext.setUpload(upload);
    // Strict Validation Handler. If this throws, this is an invalid upload (400).
    try {
        strictValidationHandler.handle(uploadValidationContext);
    } catch (UploadValidationException ex) {
        throw new BadRequestException(ex);
    }
    // Transcribe Consent.
    transcribeConsentHandler.handle(uploadValidationContext);
    // Upload raw JSON as the raw data attachment. This is different from how the upload validation handles it.
    // Attachment ID is "[uploadId]-raw.json".
    String rawDataAttachmentId = uploadId + RAW_ATTACHMENT_SUFFIX;
    String rawDataValue = BridgeObjectMapper.get().writerWithDefaultPrettyPrinter().writeValueAsString(healthDataSubmission.getData());
    byte[] rawDataBytes = rawDataValue.getBytes(Charsets.UTF_8);
    uploadFileHelper.uploadBytesAsAttachment(rawDataAttachmentId, rawDataBytes);
    record.setRawDataAttachmentId(rawDataAttachmentId);
    // Upload Artifacts.
    uploadArtifactsHandler.handle(uploadValidationContext);
    // The handlers may have modified the record (depending on attachments). So we need to use the record ID
    // to fetch the record again and return it.
    return getRecordById(uploadValidationContext.getRecordId());
}
Also used : UploadValidationException(org.sagebionetworks.bridge.upload.UploadValidationException) UploadValidationContext(org.sagebionetworks.bridge.upload.UploadValidationContext) HealthDataRecord(org.sagebionetworks.bridge.models.healthdata.HealthDataRecord) Upload(org.sagebionetworks.bridge.models.upload.Upload) BadRequestException(org.sagebionetworks.bridge.exceptions.BadRequestException) JsonNode(com.fasterxml.jackson.databind.JsonNode) UploadSchema(org.sagebionetworks.bridge.models.upload.UploadSchema) InvalidEntityException(org.sagebionetworks.bridge.exceptions.InvalidEntityException)
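
A minimal sketch of a caller, assuming an injected HealthDataService; the package for HealthDataSubmission is assumed to match HealthDataRecord's, and only submitHealthData itself comes from the example above.

import java.io.IOException;
import org.sagebionetworks.bridge.models.accounts.StudyParticipant;
import org.sagebionetworks.bridge.models.healthdata.HealthDataRecord;
import org.sagebionetworks.bridge.models.healthdata.HealthDataSubmission;  // assumed package
import org.sagebionetworks.bridge.services.HealthDataService;
import org.sagebionetworks.bridge.upload.UploadValidationException;

// Hypothetical caller, not part of BridgeServer2.
public class SurveyResponseSubmitter {
    private final HealthDataService healthDataService;

    public SurveyResponseSubmitter(HealthDataService healthDataService) {
        this.healthDataService = healthDataService;
    }

    public HealthDataRecord submit(String appId, StudyParticipant participant,
            HealthDataSubmission submission) throws IOException, UploadValidationException {
        // The service validates the submission, runs the strict-validation and consent handlers,
        // stores the raw JSON as an attachment, and returns the re-fetched, persisted record.
        return healthDataService.submitHealthData(appId, participant, submission);
    }
}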

Aggregations

UploadSchema (org.sagebionetworks.bridge.models.upload.UploadSchema): 122 uses
Test (org.testng.annotations.Test): 80 uses
DynamoUploadSchema (org.sagebionetworks.bridge.dynamodb.DynamoUploadSchema): 22 uses
ObjectNode (com.fasterxml.jackson.databind.node.ObjectNode): 18 uses
UploadFieldDefinition (org.sagebionetworks.bridge.models.upload.UploadFieldDefinition): 15 uses
UploadSchemaService (org.sagebionetworks.bridge.services.UploadSchemaService): 15 uses
Survey (org.sagebionetworks.bridge.models.surveys.Survey): 14 uses
JsonNode (com.fasterxml.jackson.databind.JsonNode): 9 uses
GuidCreatedOnVersionHolderImpl (org.sagebionetworks.bridge.models.GuidCreatedOnVersionHolderImpl): 9 uses
HealthDataRecord (org.sagebionetworks.bridge.models.healthdata.HealthDataRecord): 8 uses
ArrayList (java.util.ArrayList): 7 uses
BadRequestException (org.sagebionetworks.bridge.exceptions.BadRequestException): 7 uses
UserSession (org.sagebionetworks.bridge.models.accounts.UserSession): 7 uses
Mockito.anyString (org.mockito.Mockito.anyString): 6 uses
EntityNotFoundException (org.sagebionetworks.bridge.exceptions.EntityNotFoundException): 6 uses
DynamoSurvey (org.sagebionetworks.bridge.dynamodb.DynamoSurvey): 5 uses
ClientInfo (org.sagebionetworks.bridge.models.ClientInfo): 4 uses
SchemaReference (org.sagebionetworks.bridge.models.schedules.SchemaReference): 3 uses
SharedModuleImportStatus (org.sagebionetworks.bridge.models.sharedmodules.SharedModuleImportStatus): 3 uses
SurveyService (org.sagebionetworks.bridge.services.SurveyService): 3 uses