Example 51 with CharStream

Use of org.antlr.v4.runtime.CharStream in project antlr4 by tunnelvisionlabs.

From the class LexerATNSimulator, method execATN.

protected int execATN(@NotNull CharStream input, @NotNull DFAState ds0) {
    // System.out.println("enter exec index "+input.index()+" from "+ds0.configs);
    if (debug) {
        System.out.format(Locale.getDefault(), "start state closure=%s\n", ds0.configs);
    }
    if (ds0.isAcceptState()) {
        // allow zero-length tokens
        captureSimState(prevAccept, input, ds0);
    }
    int t = input.LA(1);
    @NotNull DFAState s = ds0; // s is current/from DFA state
    while (true) {
        // while more work
        if (debug) {
            System.out.format(Locale.getDefault(), "execATN loop starting closure: %s\n", s.configs);
        }
        // As we move src->trg, src->trg, we keep track of the previous trg to
        // avoid looking up the DFA state again, which is expensive.
        // If the previous target was already part of the DFA, we might
        // be able to avoid doing a reach operation upon t. If s!=null,
        // it means that semantic predicates didn't prevent us from
        // creating a DFA state. Once we know s!=null, we check to see if
        // the DFA state has an edge already for t. If so, we can just reuse
        // its configuration set; there's no point in re-computing it.
        // This is kind of like doing DFA simulation within the ATN
        // simulation because DFA simulation is really just a way to avoid
        // computing reach/closure sets. Technically, once we know that
        // we have a previously added DFA state, we could jump over to
        // the DFA simulator. But, that would mean popping back and forth
        // a lot and making things more complicated algorithmically.
        // This optimization makes a lot of sense for loops within DFA.
        // A character will take us back to an existing DFA state
        // that already has lots of edges out of it. e.g., .* in comments.
        DFAState target = getExistingTargetState(s, t);
        if (target == null) {
            target = computeTargetState(input, s, t);
        }
        if (target == ERROR) {
            break;
        }
        // If this is a consumable input element, make sure to consume before
        // capturing the accept state so the input index, line, and char
        // position accurately reflect the state of the interpreter at the
        // end of the token.
        if (t != IntStream.EOF) {
            consume(input);
        }
        if (target.isAcceptState()) {
            captureSimState(prevAccept, input, target);
            if (t == IntStream.EOF) {
                break;
            }
        }
        t = input.LA(1);
        // flip; current DFA target becomes new src/from state
        s = target;
    }
    return failOrAccept(prevAccept, input, s.configs, t);
}
Also used : DFAState(org.antlr.v4.runtime.dfa.DFAState) NotNull(org.antlr.v4.runtime.misc.NotNull)
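As a hedged illustration (not code from the project above) of how a CharStream ends up driving execATN: each call to Lexer.nextToken() goes through LexerATNSimulator.match(), which delegates to execATN(). The sketch below builds a LexerInterpreter from a small inline grammar and tokenizes a string. It assumes the ANTLR tool classes (org.antlr.v4.tool.LexerGrammar) are on the classpath; the grammar, input text, and class name are made up for illustration.

import org.antlr.v4.runtime.CharStream;
import org.antlr.v4.runtime.CharStreams;
import org.antlr.v4.runtime.CommonTokenStream;
import org.antlr.v4.runtime.LexerInterpreter;
import org.antlr.v4.tool.LexerGrammar;

public class ExecAtnSketch {
    public static void main(String[] args) throws Exception {
        // Tiny inline lexer grammar; rule names are illustrative only.
        LexerGrammar lg = new LexerGrammar(
            "lexer grammar L;\n" +
            "ID : [a-z]+ ;\n" +
            "WS : [ \\t\\r\\n]+ -> skip ;\n");
        CharStream input = CharStreams.fromString("ab cd");
        LexerInterpreter lexer = lg.createLexerInterpreter(input);
        CommonTokenStream tokens = new CommonTokenStream(lexer);
        tokens.fill(); // each token is matched via LexerATNSimulator.match()/execATN()
        System.out.println(tokens.getTokens());
    }
}

Every token printed was produced by a pass of the simulator over the CharStream.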

Example 52 with CharStream

Use of org.antlr.v4.runtime.CharStream in project antlr4 by tunnelvisionlabs.

From the class LexerATNSimulator, method match.

public int match(@NotNull CharStream input, int mode) {
    match_calls++;
    this.mode = mode;
    int mark = input.mark();
    try {
        this.startIndex = input.index();
        this.prevAccept.reset();
        DFAState s0 = atn.modeToDFA[mode].s0.get();
        if (s0 == null) {
            return matchATN(input);
        } else {
            return execATN(input, s0);
        }
    } finally {
        input.release(mark);
    }
}
Also used : DFAState(org.antlr.v4.runtime.dfa.DFAState)
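match() brackets its work with mark()/release() so that buffering CharStream implementations keep characters available while a token is being matched, and it records the token start with index(). Below is a minimal, runnable sketch (not project code; the class name is made up) exercising those same CharStream calls in isolation, using only the runtime.

import org.antlr.v4.runtime.CharStream;
import org.antlr.v4.runtime.CharStreams;
import org.antlr.v4.runtime.IntStream;

public class CharStreamMarkDemo {
    public static void main(String[] args) {
        CharStream input = CharStreams.fromString("abc");
        int mark = input.mark();          // protect buffered characters, as match() does
        try {
            int start = input.index();    // where the next token would begin
            while (input.LA(1) != IntStream.EOF) {
                input.consume();          // advance one character, as execATN does
            }
            System.out.println("started at " + start + ", stopped at " + input.index());
        } finally {
            input.release(mark);          // always pair release() with mark()
        }
    }
}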

Example 53 with CharStream

Use of org.antlr.v4.runtime.CharStream in project antlr4 by tunnelvisionlabs.

From the class TestPerformance, method getParserFactory.

protected ParserFactory getParserFactory(String lexerName, String parserName, String listenerName, final String entryPoint) {
    try {
        ClassLoader loader = new URLClassLoader(new URL[] { new File(tmpdir).toURI().toURL() }, ClassLoader.getSystemClassLoader());
        final Class<? extends Lexer> lexerClass = loader.loadClass(lexerName).asSubclass(Lexer.class);
        final Class<? extends Parser> parserClass = loader.loadClass(parserName).asSubclass(Parser.class);
        final Class<? extends ParseTreeListener> listenerClass = (Class<? extends ParseTreeListener>) loader.loadClass(listenerName).asSubclass(ParseTreeListener.class);
        final Constructor<? extends Lexer> lexerCtor = lexerClass.getConstructor(CharStream.class);
        final Constructor<? extends Parser> parserCtor = parserClass.getConstructor(TokenStream.class);
        // construct initial instances of the lexer and parser to deserialize their ATNs
        TokenSource tokenSource = lexerCtor.newInstance(CharStreams.fromString(""));
        parserCtor.newInstance(new CommonTokenStream(tokenSource));
        if (!REUSE_LEXER_DFA) {
            Field lexerSerializedATNField = lexerClass.getField("_serializedATN");
            String lexerSerializedATN = (String) lexerSerializedATNField.get(null);
            for (int i = 0; i < NUMBER_OF_THREADS; i++) {
                sharedLexerATNs[i] = new ATNDeserializer().deserialize(lexerSerializedATN.toCharArray());
            }
        }
        if (RUN_PARSER && !REUSE_PARSER_DFA) {
            Field parserSerializedATNField = parserClass.getField("_serializedATN");
            String parserSerializedATN = (String) parserSerializedATNField.get(null);
            for (int i = 0; i < NUMBER_OF_THREADS; i++) {
                sharedParserATNs[i] = new ATNDeserializer().deserialize(parserSerializedATN.toCharArray());
            }
        }
        return new ParserFactory() {

            @SuppressWarnings("unused")
            @Override
            public FileParseResult parseFile(CharStream input, int currentPass, int thread) {
                final MurmurHashChecksum checksum = new MurmurHashChecksum();
                final long startTime = System.nanoTime();
                assert thread >= 0 && thread < NUMBER_OF_THREADS;
                try {
                    ParseTreeListener listener = sharedListeners[thread];
                    if (listener == null) {
                        listener = listenerClass.newInstance();
                        sharedListeners[thread] = listener;
                    }
                    Lexer lexer = sharedLexers[thread];
                    if (REUSE_LEXER && lexer != null) {
                        lexer.setInputStream(input);
                    } else {
                        Lexer previousLexer = lexer;
                        lexer = lexerCtor.newInstance(input);
                        sharedLexers[thread] = lexer;
                        ATN atn = (FILE_GRANULARITY || previousLexer == null ? lexer : previousLexer).getATN();
                        if (!REUSE_LEXER_DFA || (!FILE_GRANULARITY && previousLexer == null)) {
                            atn = sharedLexerATNs[thread];
                        }
                        if (!ENABLE_LEXER_DFA) {
                            lexer.setInterpreter(new NonCachingLexerATNSimulator(lexer, atn));
                        } else if (!REUSE_LEXER_DFA || COMPUTE_TRANSITION_STATS) {
                            lexer.setInterpreter(new StatisticsLexerATNSimulator(lexer, atn));
                        }
                    }
                    lexer.removeErrorListeners();
                    lexer.addErrorListener(DescriptiveLexerErrorListener.INSTANCE);
                    lexer.getInterpreter().optimize_tail_calls = OPTIMIZE_TAIL_CALLS;
                    if (ENABLE_LEXER_DFA && !REUSE_LEXER_DFA) {
                        lexer.getInterpreter().atn.clearDFA();
                    }
                    CommonTokenStream tokens = new CommonTokenStream(lexer);
                    tokens.fill();
                    tokenCount.addAndGet(currentPass, tokens.size());
                    if (COMPUTE_CHECKSUM) {
                        for (Token token : tokens.getTokens()) {
                            updateChecksum(checksum, token);
                        }
                    }
                    if (!RUN_PARSER) {
                        return new FileParseResult(input.getSourceName(), checksum.getValue(), null, tokens.size(), startTime, lexer, null);
                    }
                    final long parseStartTime = System.nanoTime();
                    Parser parser = sharedParsers[thread];
                    if (REUSE_PARSER && parser != null) {
                        parser.setInputStream(tokens);
                    } else {
                        Parser previousParser = parser;
                        if (USE_PARSER_INTERPRETER) {
                            Parser referenceParser = parserCtor.newInstance(tokens);
                            parser = new ParserInterpreter(referenceParser.getGrammarFileName(), referenceParser.getVocabulary(), Arrays.asList(referenceParser.getRuleNames()), referenceParser.getATN(), tokens);
                        } else {
                            parser = parserCtor.newInstance(tokens);
                        }
                        ATN atn = (FILE_GRANULARITY || previousParser == null ? parser : previousParser).getATN();
                        if (!REUSE_PARSER_DFA || (!FILE_GRANULARITY && previousParser == null)) {
                            atn = sharedParserATNs[thread];
                        }
                        if (!ENABLE_PARSER_DFA) {
                            parser.setInterpreter(new NonCachingParserATNSimulator(parser, atn));
                        } else if (!REUSE_PARSER_DFA || COMPUTE_TRANSITION_STATS) {
                            parser.setInterpreter(new StatisticsParserATNSimulator(parser, atn));
                        }
                        sharedParsers[thread] = parser;
                    }
                    parser.removeParseListeners();
                    parser.removeErrorListeners();
                    if (!TWO_STAGE_PARSING) {
                        parser.addErrorListener(DescriptiveErrorListener.INSTANCE);
                        parser.addErrorListener(new SummarizingDiagnosticErrorListener());
                    }
                    if (ENABLE_PARSER_DFA && !REUSE_PARSER_DFA) {
                        parser.getInterpreter().atn.clearDFA();
                    }
                    parser.getInterpreter().setPredictionMode(TWO_STAGE_PARSING ? PredictionMode.SLL : PREDICTION_MODE);
                    parser.getInterpreter().force_global_context = FORCE_GLOBAL_CONTEXT && !TWO_STAGE_PARSING;
                    parser.getInterpreter().always_try_local_context = TRY_LOCAL_CONTEXT_FIRST || TWO_STAGE_PARSING;
                    parser.getInterpreter().enable_global_context_dfa = ENABLE_PARSER_FULL_CONTEXT_DFA;
                    parser.getInterpreter().optimize_ll1 = OPTIMIZE_LL1;
                    parser.getInterpreter().optimize_unique_closure = OPTIMIZE_UNIQUE_CLOSURE;
                    parser.getInterpreter().optimize_tail_calls = OPTIMIZE_TAIL_CALLS;
                    parser.getInterpreter().tail_call_preserves_sll = TAIL_CALL_PRESERVES_SLL;
                    parser.getInterpreter().treat_sllk1_conflict_as_ambiguity = TREAT_SLLK1_CONFLICT_AS_AMBIGUITY;
                    parser.setBuildParseTree(BUILD_PARSE_TREES);
                    if (!BUILD_PARSE_TREES && BLANK_LISTENER) {
                        parser.addParseListener(listener);
                    }
                    if (BAIL_ON_ERROR || TWO_STAGE_PARSING) {
                        parser.setErrorHandler(new BailErrorStrategy());
                    }
                    Method parseMethod = parserClass.getMethod(entryPoint);
                    Object parseResult;
                    try {
                        if (COMPUTE_CHECKSUM && !BUILD_PARSE_TREES) {
                            parser.addParseListener(new ChecksumParseTreeListener(checksum));
                        }
                        if (USE_PARSER_INTERPRETER) {
                            ParserInterpreter parserInterpreter = (ParserInterpreter) parser;
                            parseResult = parserInterpreter.parse(Collections.lastIndexOfSubList(Arrays.asList(parser.getRuleNames()), Collections.singletonList(entryPoint)));
                        } else {
                            parseResult = parseMethod.invoke(parser);
                        }
                    } catch (InvocationTargetException ex) {
                        if (!TWO_STAGE_PARSING) {
                            throw ex;
                        }
                        String sourceName = tokens.getSourceName();
                        sourceName = sourceName != null && !sourceName.isEmpty() ? sourceName + ": " : "";
                        if (REPORT_SECOND_STAGE_RETRY) {
                            System.err.println(sourceName + "Forced to retry with full context.");
                        }
                        if (!(ex.getCause() instanceof ParseCancellationException)) {
                            throw ex;
                        }
                        tokens.seek(0);
                        if (REUSE_PARSER && sharedParsers[thread] != null) {
                            parser.setInputStream(tokens);
                        } else {
                            if (USE_PARSER_INTERPRETER) {
                                Parser referenceParser = parserCtor.newInstance(tokens);
                                parser = new ParserInterpreter(referenceParser.getGrammarFileName(), referenceParser.getVocabulary(), Arrays.asList(referenceParser.getRuleNames()), referenceParser.getATN(), tokens);
                            } else {
                                parser = parserCtor.newInstance(tokens);
                            }
                            sharedParsers[thread] = parser;
                        }
                        parser.removeParseListeners();
                        parser.removeErrorListeners();
                        parser.addErrorListener(DescriptiveErrorListener.INSTANCE);
                        parser.addErrorListener(new SummarizingDiagnosticErrorListener());
                        if (!ENABLE_PARSER_DFA) {
                            parser.setInterpreter(new NonCachingParserATNSimulator(parser, parser.getATN()));
                        } else if (!REUSE_PARSER_DFA) {
                            parser.setInterpreter(new StatisticsParserATNSimulator(parser, sharedParserATNs[thread]));
                        } else if (COMPUTE_TRANSITION_STATS) {
                            parser.setInterpreter(new StatisticsParserATNSimulator(parser, parser.getATN()));
                        }
                        parser.getInterpreter().setPredictionMode(PREDICTION_MODE);
                        parser.getInterpreter().force_global_context = FORCE_GLOBAL_CONTEXT;
                        parser.getInterpreter().always_try_local_context = TRY_LOCAL_CONTEXT_FIRST;
                        parser.getInterpreter().enable_global_context_dfa = ENABLE_PARSER_FULL_CONTEXT_DFA;
                        parser.getInterpreter().optimize_ll1 = OPTIMIZE_LL1;
                        parser.getInterpreter().optimize_unique_closure = OPTIMIZE_UNIQUE_CLOSURE;
                        parser.getInterpreter().optimize_tail_calls = OPTIMIZE_TAIL_CALLS;
                        parser.getInterpreter().tail_call_preserves_sll = TAIL_CALL_PRESERVES_SLL;
                        parser.getInterpreter().treat_sllk1_conflict_as_ambiguity = TREAT_SLLK1_CONFLICT_AS_AMBIGUITY;
                        parser.setBuildParseTree(BUILD_PARSE_TREES);
                        if (COMPUTE_CHECKSUM && !BUILD_PARSE_TREES) {
                            parser.addParseListener(new ChecksumParseTreeListener(checksum));
                        }
                        if (!BUILD_PARSE_TREES && BLANK_LISTENER) {
                            parser.addParseListener(listener);
                        }
                        if (BAIL_ON_ERROR) {
                            parser.setErrorHandler(new BailErrorStrategy());
                        }
                        parseResult = parseMethod.invoke(parser);
                    }
                    assertThat(parseResult, instanceOf(ParseTree.class));
                    if (COMPUTE_CHECKSUM && BUILD_PARSE_TREES) {
                        ParseTreeWalker.DEFAULT.walk(new ChecksumParseTreeListener(checksum), (ParseTree) parseResult);
                    }
                    if (BUILD_PARSE_TREES && BLANK_LISTENER) {
                        ParseTreeWalker.DEFAULT.walk(listener, (ParserRuleContext) parseResult);
                    }
                    return new FileParseResult(input.getSourceName(), checksum.getValue(), (ParseTree) parseResult, tokens.size(), TIME_PARSE_ONLY ? parseStartTime : startTime, lexer, parser);
                } catch (Exception e) {
                    if (!REPORT_SYNTAX_ERRORS && e instanceof ParseCancellationException) {
                        return new FileParseResult("unknown", checksum.getValue(), null, 0, startTime, null, null);
                    }
                    e.printStackTrace(System.out);
                    throw new IllegalStateException(e);
                }
            }
        };
    } catch (Exception e) {
        e.printStackTrace(System.out);
        Assert.fail(e.getMessage());
        throw new IllegalStateException(e);
    }
}
Also used : ATNDeserializer(org.antlr.v4.runtime.atn.ATNDeserializer) ParserInterpreter(org.antlr.v4.runtime.ParserInterpreter) Token(org.antlr.v4.runtime.Token) CodePointCharStream(org.antlr.v4.runtime.CodePointCharStream) CharStream(org.antlr.v4.runtime.CharStream) Field(java.lang.reflect.Field) URLClassLoader(java.net.URLClassLoader) CommonTokenStream(org.antlr.v4.runtime.CommonTokenStream) TokenSource(org.antlr.v4.runtime.TokenSource) BailErrorStrategy(org.antlr.v4.runtime.BailErrorStrategy) Method(java.lang.reflect.Method) InvocationTargetException(java.lang.reflect.InvocationTargetException) ParseCancellationException(org.antlr.v4.runtime.misc.ParseCancellationException) IOException(java.io.IOException) ExecutionException(java.util.concurrent.ExecutionException) RecognitionException(org.antlr.v4.runtime.RecognitionException) Parser(org.antlr.v4.runtime.Parser) ParseTreeListener(org.antlr.v4.runtime.tree.ParseTreeListener) Lexer(org.antlr.v4.runtime.Lexer) ATN(org.antlr.v4.runtime.atn.ATN) File(java.io.File) ParseTree(org.antlr.v4.runtime.tree.ParseTree)
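The factory above obtains the generated recognizers reflectively, in particular through Constructor<? extends Lexer> with a single CharStream parameter. The condensed sketch below shows just that pattern on its own; the classes directory, the class name com.example.MyLexer, and the input text are assumptions for illustration, not names from the project.

import java.io.File;
import java.lang.reflect.Constructor;
import java.net.URL;
import java.net.URLClassLoader;
import org.antlr.v4.runtime.CharStream;
import org.antlr.v4.runtime.CharStreams;
import org.antlr.v4.runtime.CommonTokenStream;
import org.antlr.v4.runtime.Lexer;

public class ReflectiveLexerDemo {
    public static void main(String[] args) throws Exception {
        URL classesDir = new File("build/generated-classes").toURI().toURL(); // assumed path
        ClassLoader loader = new URLClassLoader(new URL[] { classesDir },
                ClassLoader.getSystemClassLoader());
        Class<? extends Lexer> lexerClass =
                loader.loadClass("com.example.MyLexer").asSubclass(Lexer.class); // assumed name
        Constructor<? extends Lexer> lexerCtor = lexerClass.getConstructor(CharStream.class);

        CharStream input = CharStreams.fromString("1 + 2");
        Lexer lexer = lexerCtor.newInstance(input);          // construct the lexer with a CharStream
        CommonTokenStream tokens = new CommonTokenStream(lexer);
        tokens.fill();
        System.out.println(tokens.size() + " tokens");
    }
}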

Example 54 with CharStream

Use of org.antlr.v4.runtime.CharStream in project antlr4 by tunnelvisionlabs.

From the class TestPerformance, method parseSources.

@SuppressWarnings("unused")
protected void parseSources(final int currentPass, final ParserFactory factory, Collection<InputDescriptor> sources, boolean shuffleSources) throws InterruptedException {
    if (shuffleSources) {
        List<InputDescriptor> sourcesList = new ArrayList<InputDescriptor>(sources);
        synchronized (RANDOM) {
            Collections.shuffle(sourcesList, RANDOM);
        }
        sources = sourcesList;
    }
    long startTime = System.nanoTime();
    tokenCount.set(currentPass, 0);
    int inputSize = 0;
    int inputCount = 0;
    Collection<Future<FileParseResult>> results = new ArrayList<Future<FileParseResult>>();
    ExecutorService executorService;
    if (FILE_GRANULARITY) {
        executorService = Executors.newFixedThreadPool(FILE_GRANULARITY ? NUMBER_OF_THREADS : 1, new NumberedThreadFactory());
    } else {
        executorService = Executors.newSingleThreadExecutor(new FixedThreadNumberFactory(((NumberedThread) Thread.currentThread()).getThreadNumber()));
    }
    for (InputDescriptor inputDescriptor : sources) {
        if (inputCount >= MAX_FILES_PER_PARSE_ITERATION) {
            break;
        }
        final CharStream input = inputDescriptor.getInputStream();
        input.seek(0);
        inputSize += input.size();
        inputCount++;
        Future<FileParseResult> futureChecksum = executorService.submit(new Callable<FileParseResult>() {

            @Override
            public FileParseResult call() {
                // System.out.format("Parsing file %s\n", input.getSourceName());
                try {
                    return factory.parseFile(input, currentPass, ((NumberedThread) Thread.currentThread()).getThreadNumber());
                } catch (IllegalStateException ex) {
                    ex.printStackTrace(System.err);
                } catch (Throwable t) {
                    t.printStackTrace(System.err);
                }
                return null;
            }
        });
        results.add(futureChecksum);
    }
    MurmurHashChecksum checksum = new MurmurHashChecksum();
    int currentIndex = -1;
    for (Future<FileParseResult> future : results) {
        currentIndex++;
        int fileChecksum = 0;
        try {
            FileParseResult fileResult = future.get();
            if (COMPUTE_TRANSITION_STATS) {
                totalTransitionsPerFile[currentPass][currentIndex] = sum(fileResult.parserTotalTransitions);
                computedTransitionsPerFile[currentPass][currentIndex] = sum(fileResult.parserComputedTransitions);
                if (DETAILED_DFA_STATE_STATS) {
                    decisionInvocationsPerFile[currentPass][currentIndex] = fileResult.decisionInvocations;
                    fullContextFallbackPerFile[currentPass][currentIndex] = fileResult.fullContextFallback;
                    nonSllPerFile[currentPass][currentIndex] = fileResult.nonSll;
                    totalTransitionsPerDecisionPerFile[currentPass][currentIndex] = fileResult.parserTotalTransitions;
                    computedTransitionsPerDecisionPerFile[currentPass][currentIndex] = fileResult.parserComputedTransitions;
                    fullContextTransitionsPerDecisionPerFile[currentPass][currentIndex] = fileResult.parserFullContextTransitions;
                }
            }
            if (COMPUTE_TIMING_STATS) {
                timePerFile[currentPass][currentIndex] = fileResult.endTime - fileResult.startTime;
                tokensPerFile[currentPass][currentIndex] = fileResult.tokenCount;
            }
            fileChecksum = fileResult.checksum;
        } catch (ExecutionException ex) {
            Logger.getLogger(TestPerformance.class.getName()).log(Level.SEVERE, null, ex);
        }
        if (COMPUTE_CHECKSUM) {
            updateChecksum(checksum, fileChecksum);
        }
    }
    executorService.shutdown();
    executorService.awaitTermination(Long.MAX_VALUE, TimeUnit.NANOSECONDS);
    System.out.format("%d. Total parse time for %d files (%d KB, %d tokens%s): %.0fms%n", currentPass + 1, inputCount, inputSize / 1024, tokenCount.get(currentPass), COMPUTE_CHECKSUM ? String.format(", checksum 0x%8X", checksum.getValue()) : "", (double) (System.nanoTime() - startTime) / 1000000.0);
    if (sharedLexers.length > 0) {
        int index = FILE_GRANULARITY ? 0 : ((NumberedThread) Thread.currentThread()).getThreadNumber();
        Lexer lexer = sharedLexers[index];
        final LexerATNSimulator lexerInterpreter = lexer.getInterpreter();
        final DFA[] modeToDFA = lexerInterpreter.atn.modeToDFA;
        if (SHOW_DFA_STATE_STATS) {
            int states = 0;
            int configs = 0;
            Set<ATNConfig> uniqueConfigs = new HashSet<ATNConfig>();
            for (int i = 0; i < modeToDFA.length; i++) {
                DFA dfa = modeToDFA[i];
                if (dfa == null) {
                    continue;
                }
                states += dfa.states.size();
                for (DFAState state : dfa.states.values()) {
                    configs += state.configs.size();
                    uniqueConfigs.addAll(state.configs);
                }
            }
            System.out.format("There are %d lexer DFAState instances, %d configs (%d unique), %d prediction contexts.%n", states, configs, uniqueConfigs.size(), lexerInterpreter.atn.getContextCacheSize());
            if (DETAILED_DFA_STATE_STATS) {
                System.out.format("\tMode\tStates\tConfigs\tMode%n");
                for (int i = 0; i < modeToDFA.length; i++) {
                    DFA dfa = modeToDFA[i];
                    if (dfa == null || dfa.states.isEmpty()) {
                        continue;
                    }
                    int modeConfigs = 0;
                    for (DFAState state : dfa.states.values()) {
                        modeConfigs += state.configs.size();
                    }
                    String modeName = lexer.getModeNames()[i];
                    System.out.format("\t%d\t%d\t%d\t%s%n", dfa.decision, dfa.states.size(), modeConfigs, modeName);
                }
            }
        }
    }
    if (RUN_PARSER && sharedParsers.length > 0) {
        int index = FILE_GRANULARITY ? 0 : ((NumberedThread) Thread.currentThread()).getThreadNumber();
        Parser parser = sharedParsers[index];
        // make sure the individual DFAState objects actually have unique ATNConfig arrays
        final ParserATNSimulator interpreter = parser.getInterpreter();
        final DFA[] decisionToDFA = interpreter.atn.decisionToDFA;
        if (SHOW_DFA_STATE_STATS) {
            int states = 0;
            int configs = 0;
            Set<ATNConfig> uniqueConfigs = new HashSet<ATNConfig>();
            for (int i = 0; i < decisionToDFA.length; i++) {
                DFA dfa = decisionToDFA[i];
                if (dfa == null) {
                    continue;
                }
                states += dfa.states.size();
                for (DFAState state : dfa.states.values()) {
                    configs += state.configs.size();
                    uniqueConfigs.addAll(state.configs);
                }
            }
            System.out.format("There are %d parser DFAState instances, %d configs (%d unique), %d prediction contexts.%n", states, configs, uniqueConfigs.size(), interpreter.atn.getContextCacheSize());
            if (DETAILED_DFA_STATE_STATS) {
                if (COMPUTE_TRANSITION_STATS) {
                    System.out.format("\tDecision\tStates\tConfigs\tPredict (ALL)\tPredict (LL)\tNon-SLL\tTransitions\tTransitions (ATN)\tTransitions (LL)\tLA (SLL)\tLA (LL)\tRule%n");
                } else {
                    System.out.format("\tDecision\tStates\tConfigs\tRule%n");
                }
                for (int i = 0; i < decisionToDFA.length; i++) {
                    DFA dfa = decisionToDFA[i];
                    if (dfa == null || dfa.states.isEmpty()) {
                        continue;
                    }
                    int decisionConfigs = 0;
                    for (DFAState state : dfa.states.values()) {
                        decisionConfigs += state.configs.size();
                    }
                    String ruleName = parser.getRuleNames()[parser.getATN().decisionToState.get(dfa.decision).ruleIndex];
                    long calls = 0;
                    long fullContextCalls = 0;
                    long nonSllCalls = 0;
                    long transitions = 0;
                    long computedTransitions = 0;
                    long fullContextTransitions = 0;
                    double lookahead = 0;
                    double fullContextLookahead = 0;
                    String formatString;
                    if (COMPUTE_TRANSITION_STATS) {
                        for (long[] data : decisionInvocationsPerFile[currentPass]) {
                            calls += data[i];
                        }
                        for (long[] data : fullContextFallbackPerFile[currentPass]) {
                            fullContextCalls += data[i];
                        }
                        for (long[] data : nonSllPerFile[currentPass]) {
                            nonSllCalls += data[i];
                        }
                        for (long[] data : totalTransitionsPerDecisionPerFile[currentPass]) {
                            transitions += data[i];
                        }
                        for (long[] data : computedTransitionsPerDecisionPerFile[currentPass]) {
                            computedTransitions += data[i];
                        }
                        for (long[] data : fullContextTransitionsPerDecisionPerFile[currentPass]) {
                            fullContextTransitions += data[i];
                        }
                        if (calls > 0) {
                            lookahead = (double) (transitions - fullContextTransitions) / (double) calls;
                        }
                        if (fullContextCalls > 0) {
                            fullContextLookahead = (double) fullContextTransitions / (double) fullContextCalls;
                        }
                        formatString = "\t%1$d\t%2$d\t%3$d\t%4$d\t%5$d\t%6$d\t%7$d\t%8$d\t%9$d\t%10$f\t%11$f\t%12$s%n";
                    } else {
                        calls = 0;
                        formatString = "\t%1$d\t%2$d\t%3$d\t%12$s%n";
                    }
                    System.out.format(formatString, dfa.decision, dfa.states.size(), decisionConfigs, calls, fullContextCalls, nonSllCalls, transitions, computedTransitions, fullContextTransitions, lookahead, fullContextLookahead, ruleName);
                }
            }
        }
        int localDfaCount = 0;
        int globalDfaCount = 0;
        int localConfigCount = 0;
        int globalConfigCount = 0;
        int[] contextsInDFAState = new int[0];
        for (int i = 0; i < decisionToDFA.length; i++) {
            DFA dfa = decisionToDFA[i];
            if (dfa == null) {
                continue;
            }
            if (SHOW_CONFIG_STATS) {
                for (DFAState state : dfa.states.keySet()) {
                    if (state.configs.size() >= contextsInDFAState.length) {
                        contextsInDFAState = Arrays.copyOf(contextsInDFAState, state.configs.size() + 1);
                    }
                    if (state.isAcceptState()) {
                        boolean hasGlobal = false;
                        for (ATNConfig config : state.configs) {
                            if (config.getReachesIntoOuterContext()) {
                                globalConfigCount++;
                                hasGlobal = true;
                            } else {
                                localConfigCount++;
                            }
                        }
                        if (hasGlobal) {
                            globalDfaCount++;
                        } else {
                            localDfaCount++;
                        }
                    }
                    contextsInDFAState[state.configs.size()]++;
                }
            }
            if (EXPORT_LARGEST_CONFIG_CONTEXTS) {
                for (DFAState state : dfa.states.keySet()) {
                    for (ATNConfig config : state.configs) {
                        String configOutput = config.toDotString();
                        if (configOutput.length() <= configOutputSize) {
                            continue;
                        }
                        configOutputSize = configOutput.length();
                        writeFile(tmpdir, "d" + dfa.decision + ".s" + state.stateNumber + ".a" + config.getAlt() + ".config.dot", configOutput);
                    }
                }
            }
        }
        if (SHOW_CONFIG_STATS && currentPass == 0) {
            System.out.format("  DFA accept states: %d total, %d with only local context, %d with a global context%n", localDfaCount + globalDfaCount, localDfaCount, globalDfaCount);
            System.out.format("  Config stats: %d total, %d local, %d global%n", localConfigCount + globalConfigCount, localConfigCount, globalConfigCount);
            if (SHOW_DFA_STATE_STATS) {
                for (int i = 0; i < contextsInDFAState.length; i++) {
                    if (contextsInDFAState[i] != 0) {
                        System.out.format("  %d configs = %d%n", i, contextsInDFAState[i]);
                    }
                }
            }
        }
    }
    if (COMPUTE_TIMING_STATS) {
        System.out.format("File\tTokens\tTime%n");
        for (int i = 0; i < timePerFile[currentPass].length; i++) {
            System.out.format("%d\t%d\t%d%n", i + 1, tokensPerFile[currentPass][i], timePerFile[currentPass][i]);
        }
    }
}
Also used : ArrayList(java.util.ArrayList) CodePointCharStream(org.antlr.v4.runtime.CodePointCharStream) CharStream(org.antlr.v4.runtime.CharStream) ATNConfig(org.antlr.v4.runtime.atn.ATNConfig) ExecutionException(java.util.concurrent.ExecutionException) HashSet(java.util.HashSet) DFAState(org.antlr.v4.runtime.dfa.DFAState) Parser(org.antlr.v4.runtime.Parser) Lexer(org.antlr.v4.runtime.Lexer) ExecutorService(java.util.concurrent.ExecutorService) LexerATNSimulator(org.antlr.v4.runtime.atn.LexerATNSimulator) Future(java.util.concurrent.Future) ParserATNSimulator(org.antlr.v4.runtime.atn.ParserATNSimulator) DFA(org.antlr.v4.runtime.dfa.DFA)
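parseSources() reuses each InputDescriptor's CharStream across passes: it rewinds with seek(0) and reads size() before handing the stream to the ParserFactory. A minimal, runnable sketch of that reuse pattern follows; the class name and input text are made up for illustration.

import org.antlr.v4.runtime.CharStream;
import org.antlr.v4.runtime.CharStreams;

public class CharStreamReuseDemo {
    public static void main(String[] args) {
        CharStream input = CharStreams.fromString("first pass / second pass");
        for (int pass = 0; pass < 2; pass++) {
            input.seek(0);                 // rewind before handing it to a parser factory
            int inputSize = input.size();  // number of characters in this source
            System.out.println("pass " + pass + ": " + inputSize + " chars, index " + input.index());
        }
    }
}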

Example 55 with CharStream

Use of org.antlr.v4.runtime.CharStream in project antlr4 by tunnelvisionlabs.

From the class TestUnbufferedCharStream, method testNegativeSeek.

@Test(expected = IllegalArgumentException.class)
public void testNegativeSeek() {
    CharStream input = createStream("");
    input.seek(-1);
}
Also used : CharStream(org.antlr.v4.runtime.CharStream) UnbufferedCharStream(org.antlr.v4.runtime.UnbufferedCharStream) Test(org.junit.Test)
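The createStream(...) helper is not shown in this excerpt; the sketch below assumes it simply wraps a java.io.StringReader in an UnbufferedCharStream. It demonstrates a legal backward seek inside a marked region and then the negative seek this test expects to fail with IllegalArgumentException; the class name is made up.

import java.io.StringReader;
import org.antlr.v4.runtime.CharStream;
import org.antlr.v4.runtime.UnbufferedCharStream;

public class UnbufferedSeekDemo {
    public static void main(String[] args) {
        CharStream input = new UnbufferedCharStream(new StringReader("xyz"));
        int mark = input.mark();       // keep characters buffered so we can rewind
        input.consume();
        input.consume();
        input.seek(0);                 // legal: still inside the marked window
        System.out.println("back at index " + input.index());
        input.release(mark);
        try {
            input.seek(-1);            // the behavior under test: throws IllegalArgumentException
        } catch (IllegalArgumentException expected) {
            System.out.println("negative seek rejected: " + expected.getMessage());
        }
    }
}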

Aggregations

CharStream (org.antlr.v4.runtime.CharStream): 187
Test (org.junit.Test): 93
CommonTokenStream (org.antlr.v4.runtime.CommonTokenStream): 85
UnbufferedCharStream (org.antlr.v4.runtime.UnbufferedCharStream): 46
ParseTree (org.antlr.v4.runtime.tree.ParseTree): 38
ParseTreeWalker (org.antlr.v4.runtime.tree.ParseTreeWalker): 30
IOException (java.io.IOException): 28
InputStream (java.io.InputStream): 23
ANTLRInputStream (org.antlr.v4.runtime.ANTLRInputStream): 23
LexerInterpreter (org.antlr.v4.runtime.LexerInterpreter): 22
File (java.io.File): 21
TokenStream (org.antlr.v4.runtime.TokenStream): 21
StringReader (java.io.StringReader): 20
CancellationException (java.util.concurrent.CancellationException): 20
ConsoleErrorListener (org.antlr.v4.runtime.ConsoleErrorListener): 20
CommonTokenFactory (org.antlr.v4.runtime.CommonTokenFactory): 18
Token (org.antlr.v4.runtime.Token): 18
LexerGrammar (org.antlr.v4.tool.LexerGrammar): 18
Utils.toCharStream (clawfc.Utils.toCharStream): 15
Path (java.nio.file.Path): 14