Example 1 with Tokenizer

Use of net.morimekta.providence.serializer.pretty.Tokenizer in project providence by morimekta.

From class ProvidenceConfigParser, method parseConfigRecursively.

@SuppressWarnings("unchecked")
<M extends PMessage<M, F>, F extends PField> Pair<M, Set<String>> parseConfigRecursively(@Nonnull Path file, M parent, String[] stack) throws IOException {
    Tokenizer tokenizer;
    try (BufferedInputStream in = new BufferedInputStream(new FileInputStream(file.toFile()))) {
        // Non-enclosed content, meaning we should read the whole file immediately.
        tokenizer = new Tokenizer(new Utf8StreamReader(in), Tokenizer.DEFAULT_BUFFER_SIZE, true);
    }
    ProvidenceConfigContext context = new ProvidenceConfigContext();
    Set<String> includedFilePaths = new TreeSet<>();
    includedFilePaths.add(canonicalFileLocation(file).toString());
    Stage lastStage = Stage.INCLUDES;
    M result = null;
    Token token = tokenizer.peek();
    while (token != null) {
        tokenizer.next();
        if (lastStage == Stage.MESSAGE) {
            throw new TokenizerException(token, "Unexpected token '" + token.asString() + "', expected end of file.").setLine(tokenizer.getLine());
        } else if (INCLUDE.equals(token.asString())) {
            // if include && stage == INCLUDES --> INCLUDES
            if (lastStage != Stage.INCLUDES) {
                throw new TokenizerException(token, "Include added after defines or message. Only one def block allowed.").setLine(tokenizer.getLine());
            }
            token = tokenizer.expectLiteral("file to be included");
            String includedFilePath = token.decodeLiteral(strict);
            PMessage included;
            Path includedFile;
            try {
                includedFile = resolveFile(file, includedFilePath);
                Pair<PMessage, Set<String>> tmp = checkAndParseInternal(includedFile, null, stack);
                if (tmp != null) {
                    includedFilePaths.add(includedFile.toString());
                    includedFilePaths.addAll(tmp.second);
                    included = tmp.first;
                } else {
                    included = null;
                }
            } catch (FileNotFoundException e) {
                throw new TokenizerException(token, "Included file \"%s\" not found.", includedFilePath).setLine(tokenizer.getLine());
            }
            token = tokenizer.expectIdentifier("the token 'as'");
            if (!AS.equals(token.asString())) {
                throw new TokenizerException(token, "Expected token 'as' after included file \"%s\".", includedFilePath).setLine(tokenizer.getLine());
            }
            token = tokenizer.expectIdentifier("Include alias");
            String alias = token.asString();
            if (RESERVED_WORDS.contains(alias)) {
                throw new TokenizerException(token, "Alias \"%s\" is a reserved word.", alias).setLine(tokenizer.getLine());
            }
            if (context.containsReference(alias)) {
                throw new TokenizerException(token, "Alias \"%s\" is already used.", alias).setLine(tokenizer.getLine());
            }
            context.setInclude(alias, included);
        } else if (DEF.equals(token.asString())) {
            // if params && stage == DEF --> DEF
            lastStage = Stage.DEFINES;
            parseDefinitions(context, tokenizer);
        } else if (token.isQualifiedIdentifier()) {
            // if a.b (type identifier) --> MESSAGE
            lastStage = Stage.MESSAGE;
            PMessageDescriptor<M, F> descriptor;
            try {
                descriptor = (PMessageDescriptor) registry.getDeclaredType(token.asString());
            } catch (IllegalArgumentException e) {
                // Unknown declared type: fail in strict mode, or when this is the
                // top-level requested config file, even in non-strict mode.
                if (strict || stack.length == 1) {
                    throw new TokenizerException(token, "Unknown declared type: %s", token.asString()).setLine(tokenizer.getLine());
                }
                return null;
            }
            result = parseConfigMessage(tokenizer, context, descriptor.builder(), parent, file);
        } else {
            throw new TokenizerException(token, "Unexpected token '" + token.asString() + "'. Expected include, defines or message type").setLine(tokenizer.getLine());
        }
        token = tokenizer.peek();
    }
    if (result == null) {
        throw new TokenizerException("No message in config: " + file.getFileName().toString());
    }
    return Pair.create(result, includedFilePaths);
}
Also used: Path (java.nio.file.Path), ProvidenceConfigUtil.readCanonicalPath (net.morimekta.providence.config.impl.ProvidenceConfigUtil.readCanonicalPath), Utf8StreamReader (net.morimekta.util.io.Utf8StreamReader), FileNotFoundException (java.io.FileNotFoundException), Token (net.morimekta.providence.serializer.pretty.Token), TokenizerException (net.morimekta.providence.serializer.pretty.TokenizerException), FileInputStream (java.io.FileInputStream), BufferedInputStream (java.io.BufferedInputStream), TreeSet (java.util.TreeSet), PMessage (net.morimekta.providence.PMessage), Stage (net.morimekta.providence.config.impl.ProvidenceConfigUtil.Stage), PMessageDescriptor (net.morimekta.providence.descriptor.PMessageDescriptor), Tokenizer (net.morimekta.providence.serializer.pretty.Tokenizer), Pair (net.morimekta.util.Pair)
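
For comparison, here is a minimal stand-alone sketch (not part of the project; the class name and file path are invented for illustration) that reuses the same Tokenizer construction to simply walk and print the top-level tokens of a config file. Only Tokenizer calls that appear in the example above are used.

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.file.Path;
import java.nio.file.Paths;

import net.morimekta.providence.serializer.pretty.Token;
import net.morimekta.providence.serializer.pretty.Tokenizer;
import net.morimekta.util.io.Utf8StreamReader;

public class TokenizerWalkSketch {
    public static void main(String[] args) throws IOException {
        // Hypothetical config file path, for illustration only.
        Path file = Paths.get("service.config");
        Tokenizer tokenizer;
        try (BufferedInputStream in = new BufferedInputStream(new FileInputStream(file.toFile()))) {
            // Same construction as in parseConfigRecursively: buffer the whole file up front,
            // so the stream can be closed before tokenizing.
            tokenizer = new Tokenizer(new Utf8StreamReader(in), Tokenizer.DEFAULT_BUFFER_SIZE, true);
        }
        // peek() returns null at end of file, mirroring the loop in the example above.
        Token token = tokenizer.peek();
        while (token != null) {
            tokenizer.next();
            System.out.println(tokenizer.getLine() + ": " + token.asString());
            token = tokenizer.peek();
        }
    }
}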

Example 2 with Tokenizer

Use of net.morimekta.providence.serializer.pretty.Tokenizer in project providence by morimekta.

From class PrettySerializer, method deserialize.

@Nonnull
@Override
public <Message extends PMessage<Message, Field>, Field extends PField> Message deserialize(@Nonnull InputStream input, @Nonnull PMessageDescriptor<Message, Field> descriptor) throws IOException {
    Tokenizer tokenizer = new Tokenizer(input);
    Token first = tokenizer.peek("start of message");
    if (first.isSymbol(Token.kMessageStart)) {
        tokenizer.next();
    } else if (first.isQualifiedIdentifier()) {
        if (first.asString().equals(descriptor.getQualifiedName())) {
            // skip the name
            tokenizer.next();
            tokenizer.expectSymbol("message start after qualifier", Token.kMessageStart);
        } else {
            throw tokenizer.failure(first, "Expected qualifier " + descriptor.getQualifiedName() + " or message start, Got '" + first.asString() + "'");
        }
    } else {
        throw tokenizer.failure(first, "Expected message start or qualifier, Got '" + first.asString() + "'");
    }
    return readMessage(tokenizer, descriptor, false);
}
Also used: Token (net.morimekta.providence.serializer.pretty.Token), Tokenizer (net.morimekta.providence.serializer.pretty.Tokenizer), Nonnull (javax.annotation.Nonnull)
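
A hedged usage sketch of this deserialize overload follows. It assumes PrettySerializer has a no-argument constructor, and it reuses PApplicationException.kDescriptor (which appears in Example 3 below) as a readily available message descriptor; the pretty-text input is an illustrative empty message, where a real input would contain field entries between the braces.

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

import net.morimekta.providence.PApplicationException;
import net.morimekta.providence.serializer.PrettySerializer;

public class PrettyDeserializeSketch {
    public static void main(String[] args) throws IOException {
        // Minimal illustrative input: a pretty-printed message with no fields set.
        String pretty = "{}";
        // Assumption: the no-argument PrettySerializer constructor is available.
        PrettySerializer serializer = new PrettySerializer();
        PApplicationException message = serializer.deserialize(
                new ByteArrayInputStream(pretty.getBytes(StandardCharsets.UTF_8)),
                PApplicationException.kDescriptor);
        System.out.println(message);
    }
}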

Example 3 with Tokenizer

Use of net.morimekta.providence.serializer.pretty.Tokenizer in project providence by morimekta.

From class PrettySerializer, method deserialize (PServiceCall overload).

@Nonnull
@Override
@SuppressWarnings("unchecked")
public <Message extends PMessage<Message, Field>, Field extends PField> PServiceCall<Message, Field> deserialize(@Nonnull InputStream input, @Nonnull PService service) throws IOException {
    String methodName = null;
    int sequence = 0;
    PServiceCallType callType = null;
    try {
        // pretty printed service calls cannot be chained-serialized, so this should be totally safe.
        Tokenizer tokenizer = new Tokenizer(input);
        Token token = tokenizer.expect("Sequence or type");
        if (token.isInteger()) {
            sequence = (int) token.parseInteger();
            tokenizer.expectSymbol("Sequence type sep", Token.kKeyValueSep);
            token = tokenizer.expectIdentifier("Call Type");
        }
        callType = PServiceCallType.findByName(token.asString().toUpperCase(Locale.US));
        if (callType == null) {
            throw new TokenizerException(token, "No such call type %s", token.asString()).setLine(tokenizer.getLine()).setExceptionType(PApplicationExceptionType.INVALID_MESSAGE_TYPE);
        }
        token = tokenizer.expectIdentifier("method name");
        methodName = token.asString();
        PServiceMethod method = service.getMethod(methodName);
        if (method == null) {
            throw new TokenizerException(token, "no such method %s on service %s", methodName, service.getQualifiedName()).setLine(tokenizer.getLine()).setExceptionType(PApplicationExceptionType.UNKNOWN_METHOD);
        }
        tokenizer.expectSymbol("call params start", Token.kParamsStart);
        Message message;
        switch(callType) {
            case CALL:
            case ONEWAY:
                message = (Message) readMessage(tokenizer, method.getRequestType(), true);
                break;
            case REPLY:
                message = (Message) readMessage(tokenizer, method.getResponseType(), true);
                break;
            case EXCEPTION:
                message = (Message) readMessage(tokenizer, PApplicationException.kDescriptor, true);
                break;
            default:
                throw new IllegalStateException("Unreachable code reached");
        }
        return new PServiceCall<>(methodName, callType, sequence, message);
    } catch (TokenizerException e) {
        e.setCallType(callType).setSequenceNo(sequence).setMethodName(methodName);
        throw e;
    } catch (IOException e) {
        throw new SerializerException(e, e.getMessage()).setCallType(callType).setSequenceNo(sequence).setMethodName(methodName);
    }
}
Also used: PMessage (net.morimekta.providence.PMessage), PServiceCallType (net.morimekta.providence.PServiceCallType), PServiceCall (net.morimekta.providence.PServiceCall), Token (net.morimekta.providence.serializer.pretty.Token), TokenizerException (net.morimekta.providence.serializer.pretty.TokenizerException), IOException (java.io.IOException), Tokenizer (net.morimekta.providence.serializer.pretty.Tokenizer), PServiceMethod (net.morimekta.providence.descriptor.PServiceMethod), Nonnull (javax.annotation.Nonnull)
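
To make the parsing steps above concrete, here is a small stand-alone sketch (hypothetical class name, illustrative input text) that reads just the sequence number, call type, method name and params-start prefix by hand, using only the Tokenizer calls shown in this example.

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

import net.morimekta.providence.serializer.pretty.Token;
import net.morimekta.providence.serializer.pretty.Tokenizer;

public class ServiceCallPrefixSketch {
    public static void main(String[] args) throws IOException {
        // Illustrative call prefix: optional "1 :" sequence, call type, method name, '('.
        String callText = "1 : call ping(";
        Tokenizer tokenizer = new Tokenizer(
                new ByteArrayInputStream(callText.getBytes(StandardCharsets.UTF_8)));
        Token token = tokenizer.expect("Sequence or type");
        int sequence = 0;
        if (token.isInteger()) {
            // Same pattern as above: "<sequence> :" precedes the call type.
            sequence = (int) token.parseInteger();
            tokenizer.expectSymbol("Sequence type sep", Token.kKeyValueSep);
            token = tokenizer.expectIdentifier("Call Type");
        }
        String callType = token.asString();
        String methodName = tokenizer.expectIdentifier("method name").asString();
        tokenizer.expectSymbol("call params start", Token.kParamsStart);
        System.out.println("seq=" + sequence + " type=" + callType + " method=" + methodName);
    }
}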

Example 4 with Tokenizer

Use of net.morimekta.providence.serializer.pretty.Tokenizer in project providence by morimekta.

From class ProvidenceConfigUtilTest, method validateConsume.

private void validateConsume(String content, ProvidenceConfigContext context) throws IOException {
    if (context == null) {
        context = new ProvidenceConfigContext();
    }
    Tokenizer tokenizer = tokenizer(content);
    consumeValue(context, tokenizer, tokenizer.expect("consumed value"));
    Token next = tokenizer.expect("after consuming");
    assertThat(next, is(notNullValue()));
    assertThat(next.asString(), is("__after__"));
}
Also used: Token (net.morimekta.providence.serializer.pretty.Token), Tokenizer (net.morimekta.providence.serializer.pretty.Tokenizer)
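
The tokenizer(content) helper is private to the test class and not shown here. A plausible sketch of such a helper (an assumption, not the project's actual code) is given below, together with a tiny usage mirroring the expect calls in the test.

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

import net.morimekta.providence.serializer.pretty.Token;
import net.morimekta.providence.serializer.pretty.Tokenizer;

public class TokenizerHelperSketch {
    // Hypothetical stand-in for the test's tokenizer(content) helper: wraps the string
    // content in an InputStream, matching the Tokenizer(InputStream) constructor used
    // in the PrettySerializer examples above.
    static Tokenizer tokenizer(String content) throws IOException {
        return new Tokenizer(new ByteArrayInputStream(content.getBytes(StandardCharsets.UTF_8)));
    }

    public static void main(String[] args) throws IOException {
        Tokenizer t = tokenizer("\"a value\" __after__");
        Token value = t.expect("consumed value");
        Token after = t.expect("after consuming");
        System.out.println(value.asString() + " then " + after.asString());
    }
}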

Aggregations

Token (net.morimekta.providence.serializer.pretty.Token): 4 usages
Tokenizer (net.morimekta.providence.serializer.pretty.Tokenizer): 4 usages
Nonnull (javax.annotation.Nonnull): 2 usages
PMessage (net.morimekta.providence.PMessage): 2 usages
TokenizerException (net.morimekta.providence.serializer.pretty.TokenizerException): 2 usages
BufferedInputStream (java.io.BufferedInputStream): 1 usage
FileInputStream (java.io.FileInputStream): 1 usage
FileNotFoundException (java.io.FileNotFoundException): 1 usage
IOException (java.io.IOException): 1 usage
Path (java.nio.file.Path): 1 usage
TreeSet (java.util.TreeSet): 1 usage
PServiceCall (net.morimekta.providence.PServiceCall): 1 usage
PServiceCallType (net.morimekta.providence.PServiceCallType): 1 usage
Stage (net.morimekta.providence.config.impl.ProvidenceConfigUtil.Stage): 1 usage
ProvidenceConfigUtil.readCanonicalPath (net.morimekta.providence.config.impl.ProvidenceConfigUtil.readCanonicalPath): 1 usage
PMessageDescriptor (net.morimekta.providence.descriptor.PMessageDescriptor): 1 usage
PServiceMethod (net.morimekta.providence.descriptor.PServiceMethod): 1 usage
Pair (net.morimekta.util.Pair): 1 usage
Utf8StreamReader (net.morimekta.util.io.Utf8StreamReader): 1 usage