Example 6 with DataSchemaResolver

Use of com.linkedin.data.schema.DataSchemaResolver in the rest.li project by LinkedIn.

From the class PdlEncoderTest, the method parseSchema:

private DataSchema parseSchema(String text, String name) throws IOException {
    DataSchemaResolver resolver = MultiFormatDataSchemaResolver.withBuiltinFormats(pegasusSrcDir.getAbsolutePath());
    AbstractSchemaParser parser = new PdlSchemaParser(resolver);
    parser.parse(text);
    return extractSchema(parser, name);
}
Also used: MultiFormatDataSchemaResolver (com.linkedin.data.schema.resolver.MultiFormatDataSchemaResolver), DataSchemaResolver (com.linkedin.data.schema.DataSchemaResolver), AbstractSchemaParser (com.linkedin.data.schema.AbstractSchemaParser), PdlSchemaParser (com.linkedin.data.schema.grammar.PdlSchemaParser)
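For context, the text argument to parseSchema is raw PDL source. A minimal, hypothetical schema of the kind such a test might parse (the namespace, record name, and field are made up for illustration):

```pdl
namespace com.example

record Fortune {
  message: string
}
```

The resolver built from pegasusSrcDir is only consulted when this source references named schemas defined in other files.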

Example 7 with DataSchemaResolver

Use of com.linkedin.data.schema.DataSchemaResolver in the rest.li project by LinkedIn.

From the class RestLiSnapshotCompatibilityChecker, the method checkCompatibility:

private CompatibilityInfoMap checkCompatibility(String prevRestModelPath, String currRestModelPath, CompatibilityLevel compatLevel, boolean isAgainstRestSpec) {
    final CompatibilityInfoMap infoMap = _infoMap;
    if (compatLevel == CompatibilityLevel.OFF) {
        // skip check entirely.
        return infoMap;
    }
    final Stack<Object> path = new Stack<>();
    path.push("");
    FileInputStream prevSnapshotFile = null;
    FileInputStream currSnapshotFile = null;
    try {
        prevSnapshotFile = new FileInputStream(prevRestModelPath);
    } catch (FileNotFoundException e) {
        infoMap.addRestSpecInfo(CompatibilityInfo.Type.RESOURCE_NEW, path, currRestModelPath);
    }
    try {
        currSnapshotFile = new FileInputStream(currRestModelPath);
    } catch (FileNotFoundException e) {
        infoMap.addRestSpecInfo(CompatibilityInfo.Type.RESOURCE_MISSING, path, prevRestModelPath);
    }
    if (prevSnapshotFile == null || currSnapshotFile == null) {
        return infoMap;
    }
    AbstractSnapshot prevSnapshot = null;
    AbstractSnapshot currSnapshot = null;
    try {
        if (isAgainstRestSpec) {
            prevSnapshot = new RestSpec(prevSnapshotFile);
        } else {
            prevSnapshot = new Snapshot(prevSnapshotFile);
        }
        currSnapshot = new Snapshot(currSnapshotFile);
    } catch (IOException e) {
        infoMap.addRestSpecInfo(CompatibilityInfo.Type.OTHER_ERROR, path, e.getMessage());
    }
    if (prevSnapshot == null || currSnapshot == null) {
        return infoMap;
    }
    final DataSchemaResolver currResolver = createResolverFromSnapshot(currSnapshot, _resolverPath);
    final DataSchemaResolver prevResolver;
    if (isAgainstRestSpec) {
        prevResolver = currResolver;
    } else {
        prevResolver = createResolverFromSnapshot(prevSnapshot, _resolverPath);
    }
    final ResourceCompatibilityChecker checker = new ResourceCompatibilityChecker(prevSnapshot.getResourceSchema(), prevResolver, currSnapshot.getResourceSchema(), currResolver);
    checker.check(compatLevel);
    infoMap.addAll(checker.getInfoMap());
    return infoMap;
}
Also used: DataSchemaResolver (com.linkedin.data.schema.DataSchemaResolver), FileNotFoundException (java.io.FileNotFoundException), ResourceCompatibilityChecker (com.linkedin.restli.tools.compatibility.ResourceCompatibilityChecker), IOException (java.io.IOException), CompatibilityInfoMap (com.linkedin.restli.tools.compatibility.CompatibilityInfoMap), FileInputStream (java.io.FileInputStream), Stack (java.util.Stack)

Example 8 with DataSchemaResolver

Use of com.linkedin.data.schema.DataSchemaResolver in the rest.li project by LinkedIn.

From the class SchemaTranslator, the method avroToDataSchema:

/**
 * Translate an Avro {@link Schema} to a {@link DataSchema}.
 * <p>
 * If the translation mode is {@link AvroToDataSchemaTranslationMode#RETURN_EMBEDDED_SCHEMA}
 * and a {@link DataSchema} is embedded in the Avro schema, then return the embedded schema.
 * An embedded schema is present if the Avro {@link Schema} has a "com.linkedin.data" property and the
 * "com.linkedin.data" property contains both "schema" and "optionalDefaultMode" properties.
 * The "schema" property provides the embedded {@link DataSchema}.
 * The "optionalDefaultMode" property provides how optional default values were translated.
 * <p>
 * If the translation mode is {@link AvroToDataSchemaTranslationMode#VERIFY_EMBEDDED_SCHEMA}
 * and a {@link DataSchema} is embedded in the Avro schema, then verify that the embedded schema
 * translates to the input Avro schema. If the translated and embedded schema is the same,
 * then return the embedded schema, else throw {@link IllegalArgumentException}.
 * <p>
 * If the translation mode is {@link com.linkedin.data.avro.AvroToDataSchemaTranslationMode#TRANSLATE}
 * or no embedded {@link DataSchema} is present, then this method
 * translates the provided Avro {@link Schema} to a {@link DataSchema}
 * as follows:
 * <p>
 * This method translates union with null record fields in Avro {@link Schema}
 * to optional fields in {@link DataSchema}. Record fields
 * whose type is a union with null will be translated to a new type, and the field becomes optional.
 * If the Avro union has exactly two member types (one of them being the null type), then the
 * new type of the field is the non-null member type of the union. If the Avro union has more
 * than two member types, then the new type of the field is a union type with the null type
 * removed from the original union.
 * <p>
 * This method also translates default values. If the field's type is a union with null
 * and the field has a default value, then this method also translates the default value
 * to comply with the new type of the field. If the default value is null,
 * then the default value is removed. If the new type is not a union and the default value
 * is of the non-null member type, then the default value becomes the
 * non-null value within the union value (i.e. the value of the only entry within the
 * JSON object). If the new type is a union and the default value is of the
 * non-null member type, then the default value becomes a JSON object
 * containing a single entry whose key is the member type discriminator of
 * the first union member and whose value is the actual member value.
 * <p>
 * Both the schema and default value translations take into account that the default value
 * representation for Avro unions does not include the member type discriminator, and that
 * the type of the default value is always the first member of the union.
 *
 * @param avroSchemaInJson provides the JSON representation of the Avro {@link Schema}.
 * @param options specifies the {@link AvroToDataSchemaTranslationOptions}.
 * @return the translated {@link DataSchema}.
 * @throws IllegalArgumentException if the Avro {@link Schema} cannot be translated.
 */
public static DataSchema avroToDataSchema(String avroSchemaInJson, AvroToDataSchemaTranslationOptions options) throws IllegalArgumentException {
    ValidationOptions validationOptions = SchemaParser.getDefaultSchemaParserValidationOptions();
    validationOptions.setAvroUnionMode(true);
    SchemaParserFactory parserFactory = SchemaParserFactory.instance(validationOptions);
    DataSchemaResolver resolver = getResolver(parserFactory, options);
    PegasusSchemaParser parser = parserFactory.create(resolver);
    parser.parse(avroSchemaInJson);
    if (parser.hasError()) {
        throw new IllegalArgumentException(parser.errorMessage());
    }
    assert (parser.topLevelDataSchemas().size() == 1);
    DataSchema dataSchema = parser.topLevelDataSchemas().get(0);
    DataSchema resultDataSchema = null;
    AvroToDataSchemaTranslationMode translationMode = options.getTranslationMode();
    if (translationMode == AvroToDataSchemaTranslationMode.RETURN_EMBEDDED_SCHEMA || translationMode == AvroToDataSchemaTranslationMode.VERIFY_EMBEDDED_SCHEMA) {
        // check for embedded schema
        Object dataProperty = dataSchema.getProperties().get(SchemaTranslator.DATA_PROPERTY);
        if (dataProperty != null && dataProperty.getClass() == DataMap.class) {
            Object schemaProperty = ((DataMap) dataProperty).get(SchemaTranslator.SCHEMA_PROPERTY);
            if (schemaProperty != null && schemaProperty.getClass() == DataMap.class) {
                SchemaParser embeddedSchemaParser = SchemaParserFactory.instance().create(null);
                embeddedSchemaParser.parse(Arrays.asList(schemaProperty));
                if (embeddedSchemaParser.hasError()) {
                    throw new IllegalArgumentException("Embedded schema is invalid\n" + embeddedSchemaParser.errorMessage());
                }
                assert (embeddedSchemaParser.topLevelDataSchemas().size() == 1);
                resultDataSchema = embeddedSchemaParser.topLevelDataSchemas().get(0);
                if (translationMode == AvroToDataSchemaTranslationMode.VERIFY_EMBEDDED_SCHEMA) {
                    // additional verification to make sure that embedded schema translates to Avro schema
                    DataToAvroSchemaTranslationOptions dataToAvroSchemaOptions = new DataToAvroSchemaTranslationOptions();
                    Object optionalDefaultModeProperty = ((DataMap) dataProperty).get(SchemaTranslator.OPTIONAL_DEFAULT_MODE_PROPERTY);
                    dataToAvroSchemaOptions.setOptionalDefaultMode(OptionalDefaultMode.valueOf(optionalDefaultModeProperty.toString()));
                    Schema avroSchemaFromEmbedded = dataToAvroSchema(resultDataSchema, dataToAvroSchemaOptions);
                    Schema avroSchemaFromJson = Schema.parse(avroSchemaInJson);
                    if (!avroSchemaFromEmbedded.equals(avroSchemaFromJson)) {
                        throw new IllegalArgumentException("Embedded schema does not translate to input Avro schema: " + avroSchemaInJson);
                    }
                }
            }
        }
    }
    if (resultDataSchema == null) {
        // translationMode == TRANSLATE or no embedded schema
        DataSchemaTraverse traverse = new DataSchemaTraverse();
        traverse.traverse(dataSchema, AvroToDataSchemaConvertCallback.INSTANCE);
        // convert default values
        traverse.traverse(dataSchema, DefaultAvroToDataConvertCallback.INSTANCE);
        // make sure it can round-trip
        String dataSchemaJson = dataSchema.toString();
        resultDataSchema = DataTemplateUtil.parseSchema(dataSchemaJson);
    }
    return resultDataSchema;
}
Also used: PegasusSchemaParser (com.linkedin.data.schema.PegasusSchemaParser), SchemaParserFactory (com.linkedin.data.schema.SchemaParserFactory), Schema (org.apache.avro.Schema), DataSchema (com.linkedin.data.schema.DataSchema), RecordDataSchema (com.linkedin.data.schema.RecordDataSchema), ValidationOptions (com.linkedin.data.schema.validation.ValidationOptions), SchemaParser (com.linkedin.data.schema.SchemaParser), DataMap (com.linkedin.data.DataMap), FileDataSchemaResolver (com.linkedin.data.schema.resolver.FileDataSchemaResolver), DataSchemaResolver (com.linkedin.data.schema.DataSchemaResolver), DefaultDataSchemaResolver (com.linkedin.data.schema.resolver.DefaultDataSchemaResolver), DataSchemaTraverse (com.linkedin.data.schema.DataSchemaTraverse)
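To illustrate the union-with-null handling described in the Javadoc above, consider a hypothetical Avro record with one field whose type is a two-member union with null (the record and field names are made up for illustration):

```json
{
  "type": "record",
  "name": "Example",
  "fields": [
    { "name": "greeting", "type": ["null", "string"], "default": null }
  ]
}
```

Because the union has exactly two member types and one of them is null, the field translates to an optional Pegasus field of the non-null member type, and the null default is dropped:

```json
{
  "type": "record",
  "name": "Example",
  "fields": [
    { "name": "greeting", "type": "string", "optional": true }
  ]
}
```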

Example 9 with DataSchemaResolver

Use of com.linkedin.data.schema.DataSchemaResolver in the rest.li project by LinkedIn.

From the class TestDataSchemaResolver, the method testMapDataSchemaResolver:

@Test
public void testMapDataSchemaResolver() {
    boolean debug = false;
    DataSchemaResolver resolver = new MapDataSchemaResolver(SchemaParserFactory.instance(), _testPaths, _testSchemas);
    lookup(resolver, _testLookupAndExpectedResults, File.separatorChar, debug);
}
Also used: DataSchemaResolver (com.linkedin.data.schema.DataSchemaResolver), Test (org.testng.annotations.Test), BeforeTest (org.testng.annotations.BeforeTest)

Example 10 with DataSchemaResolver

Use of com.linkedin.data.schema.DataSchemaResolver in the rest.li project by LinkedIn.

From the class TestDataSchemaResolver, the method testCircularReferences:

@Test(dataProvider = "circularReferenceData")
public void testCircularReferences(String desc, SchemaFormatType extension, Map<String, String> testSchemas, String[][] testLookupAndExpectedResults) {
    boolean debug = false;
    for (String[] testLookupAndExpectedResult : testLookupAndExpectedResults) {
        DataSchemaResolver schemaResolver = new MapDataSchemaResolver(extension.getSchemaParserFactory(), Arrays.asList(buildSystemIndependentPath("a1")), testSchemas);
        lookup(schemaResolver, new String[][] { testLookupAndExpectedResult }, File.separatorChar, debug, extension);
    }
}
Also used: DataSchemaResolver (com.linkedin.data.schema.DataSchemaResolver), Test (org.testng.annotations.Test), BeforeTest (org.testng.annotations.BeforeTest)
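The test above resolves schemas from an in-memory map rather than the filesystem. A hypothetical pair of mutually referencing PDL schemas of the kind such a test might register (the namespace and type names are made up for illustration):

```pdl
namespace a1

record Foo {
  bar: Bar
}
```

```pdl
namespace a1

record Bar {
  foo: optional Foo
}
```

While parsing Foo, the MapDataSchemaResolver is asked to resolve Bar, whose definition in turn references Foo, exercising the resolver's handling of reference cycles.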

Aggregations

DataSchemaResolver (com.linkedin.data.schema.DataSchemaResolver): 16
NamedDataSchema (com.linkedin.data.schema.NamedDataSchema): 5
MultiFormatDataSchemaResolver (com.linkedin.data.schema.resolver.MultiFormatDataSchemaResolver): 5
IOException (java.io.IOException): 5
DataMap (com.linkedin.data.DataMap): 4
FileInputStream (java.io.FileInputStream): 4
RecordDataSchema (com.linkedin.data.schema.RecordDataSchema): 3
PdlSchemaParser (com.linkedin.data.schema.grammar.PdlSchemaParser): 3
ClasspathResourceDataSchemaResolver (com.linkedin.data.schema.resolver.ClasspathResourceDataSchemaResolver): 3
ValidationOptions (com.linkedin.data.schema.validation.ValidationOptions): 3
ResourceSchema (com.linkedin.restli.restspec.ResourceSchema): 3
File (java.io.File): 3
Map (java.util.Map): 3
Test (org.testng.annotations.Test): 3
AbstractSchemaParser (com.linkedin.data.schema.AbstractSchemaParser): 2
DataSchema (com.linkedin.data.schema.DataSchema): 2
DefaultDataSchemaResolver (com.linkedin.data.schema.resolver.DefaultDataSchemaResolver): 2
ValidationResult (com.linkedin.data.schema.validation.ValidationResult): 2
CompatibilityInfoMap (com.linkedin.restli.tools.compatibility.CompatibilityInfoMap): 2
ResourceCompatibilityChecker (com.linkedin.restli.tools.compatibility.ResourceCompatibilityChecker): 2