Example 1 with MessageCollection

Use of org.graylog2.plugin.MessageCollection in project graylog2-server by Graylog2.

From class PipelineInterpreter, method process:

/**
 * Evaluates all pipelines that apply to the given messages, based on the current stream routing
 * of the messages.
 *
 * The processing loops on each single message (passed in or created by pipelines) until the set
 * of streams does not change anymore. No cycle detection is performed.
 *
 * @param messages            the messages to process through the pipelines
 * @param interpreterListener a listener which gets called for each processing stage (e.g. to
 *                            trace execution)
 * @param state               the pipeline/stage/rule/stream connection state to use during
 *                            processing
 * @return the processed messages
 */
public Messages process(Messages messages, InterpreterListener interpreterListener, State state) {
    interpreterListener.startProcessing();
    // message id + stream id
    final Set<Tuple2<String, String>> processingBlacklist = Sets.newHashSet();
    final List<Message> toProcess = Lists.newArrayList(messages);
    final List<Message> fullyProcessed = Lists.newArrayListWithExpectedSize(toProcess.size());
    while (!toProcess.isEmpty()) {
        final MessageCollection currentSet = new MessageCollection(toProcess);
        // we'll add them back below
        toProcess.clear();
        for (Message message : currentSet) {
            final String msgId = message.getId();
            // this makes a copy of the list, which is mutated later in updateStreamBlacklist
            // it serves as a worklist, to keep track of which <msg, stream> tuples need to be re-run again
            final Set<String> initialStreamIds = message.getStreams().stream().map(Stream::getId).collect(Collectors.toSet());
            final ImmutableSet<Pipeline> pipelinesToRun = selectPipelines(interpreterListener, processingBlacklist, message, initialStreamIds, state.getStreamPipelineConnections());
            toProcess.addAll(processForResolvedPipelines(message, msgId, pipelinesToRun, interpreterListener, state));
            // add each processed message-stream combination to the blacklist set and figure out if the processing
            // has added a stream to the message, in which case we need to cycle and determine whether to process
            // its pipeline connections, too
            boolean addedStreams = updateStreamBlacklist(processingBlacklist, message, initialStreamIds);
            potentiallyDropFilteredMessage(message);
            // go to 1 and iterate over all messages again until no more streams are being assigned
            if (!addedStreams || message.getFilterOut()) {
                log.debug("[{}] no new streams matches or dropped message, not running again", msgId);
                fullyProcessed.add(message);
            } else {
                // process again, we've added a stream
                log.debug("[{}] new streams assigned, running again for those streams", msgId);
                toProcess.add(message);
            }
        }
    }
    interpreterListener.finishProcessing();
    // return the processed messages
    return new MessageCollection(fullyProcessed);
}
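The loop above re-queues a message as long as processing assigns it new streams, and retires it once the stream set stops changing. That fixed-point worklist pattern can be sketched in isolation; everything below (the `route` helper, the stream names) is illustrative, not a Graylog API:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashSet;
import java.util.Set;

// Standalone sketch of the fixed-point loop in PipelineInterpreter#process:
// keep re-running a message while the run adds streams; stop when stable.
public class FixedPointSketch {

    // Toy "pipeline connections": stream "s1" routes into "s2",
    // and "s2" routes into "s3". Purely hypothetical rules.
    static Set<String> route(Set<String> streams) {
        Set<String> added = new HashSet<>();
        if (streams.contains("s1")) added.add("s2");
        if (streams.contains("s2")) added.add("s3");
        return added;
    }

    // Returns the final stream assignment once no run adds a new stream.
    static Set<String> process(Set<String> initialStreams) {
        Deque<Set<String>> toProcess = new ArrayDeque<>();
        toProcess.add(new HashSet<>(initialStreams));
        Set<String> fullyProcessed = null;
        while (!toProcess.isEmpty()) {
            Set<String> streams = toProcess.poll();
            // snapshot plays the role of initialStreamIds in the real code
            Set<String> snapshot = new HashSet<>(streams);
            streams.addAll(route(streams));
            boolean addedStreams = !streams.equals(snapshot);
            if (addedStreams) {
                // new streams assigned: run again for those streams
                toProcess.add(streams);
            } else {
                // no new streams: the message is fully processed
                fullyProcessed = streams;
            }
        }
        return fullyProcessed;
    }

    public static void main(String[] args) {
        // "s1" transitively pulls in "s2", then "s3", over two extra passes
        System.out.println(process(Set.of("s1")));
    }
}
```

Note that, exactly as the Javadoc warns, nothing here detects cycles: if `route` kept producing previously unseen streams forever, the loop would not terminate.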
Also used: MessageCollection (org.graylog2.plugin.MessageCollection), Message (org.graylog2.plugin.Message), Tuple2 (org.jooq.lambda.tuple.Tuple2), Pipeline (org.graylog.plugins.pipelineprocessor.ast.Pipeline)

Example 2 with MessageCollection

Use of org.graylog2.plugin.MessageCollection in project graylog2-server by Graylog2.

From class PipelineInterpreterTest, method testDroppedMessageWillHaltProcessingAfterCurrentStage:

@Test
public void testDroppedMessageWillHaltProcessingAfterCurrentStage() {
    final RuleService ruleService = mock(MongoDbRuleService.class);
    when(ruleService.loadAll()).thenReturn(ImmutableList.of(
            RULE_SET_FIELD.apply("1-a"),
            RULE_SET_FIELD.apply("1-b"),
            RULE_SET_FIELD.apply("2-a"),
            RULE_SET_FIELD.apply("2-b"),
            RULE_DROP_MESSAGE));
    final PipelineService pipelineService = mock(MongoDbPipelineService.class);
    when(pipelineService.loadAll()).thenReturn(ImmutableList.of(
            PipelineDao.create("p1", "title1", "description",
                    "pipeline \"pipeline1\"\n" +
                    "stage 0 match pass\n" +
                    "    rule \"1-a\";\n" +
                    "    rule \"drop_message\";\n" +
                    "stage 1 match pass\n" +
                    "    rule \"1-b\";\n" +
                    "end\n", Tools.nowUTC(), null),
            PipelineDao.create("p2", "title2", "description",
                    "pipeline \"pipeline2\"\n" +
                    "stage 0 match pass\n" +
                    "    rule \"2-a\";\n" +
                    "stage 1 match pass\n" +
                    "    rule \"2-b\";\n" +
                    "end\n", Tools.nowUTC(), null)));
    final Map<String, Function<?>> functions = ImmutableMap.of(SetField.NAME, new SetField(), DropMessage.NAME, new DropMessage());
    final PipelineInterpreter interpreter = createPipelineInterpreter(ruleService, pipelineService, functions);
    final Messages processed = interpreter.process(messageInDefaultStream("message", "test"));
    assertThat(processed).isInstanceOf(MessageCollection.class);
    // Use MessageCollection#source here to get access to the unfiltered messages
    final List<Message> messages = ImmutableList.copyOf(((MessageCollection) processed).source());
    assertThat(messages).hasSize(1);
    final Message actualMessage = messages.get(0);
    assertThat(actualMessage.getFilterOut()).isTrue();
    // Even though "drop_message" has been called in one of the stages, all stages of the same number should
    // have been executed
    assertThat(actualMessage.getFieldAs(String.class, "1-a")).isEqualTo("value");
    assertThat(actualMessage.getFieldAs(String.class, "2-a")).isEqualTo("value");
    // The second stage in both pipelines should not have been executed due to the "drop_message" call
    assertThat(actualMessage.getField("1-b")).isNull();
    assertThat(actualMessage.getField("2-b")).isNull();
}
Also used: Function (org.graylog.plugins.pipelineprocessor.ast.functions.Function), Messages (org.graylog2.plugin.Messages), PipelineService (org.graylog.plugins.pipelineprocessor.db.PipelineService), MongoDbPipelineService (org.graylog.plugins.pipelineprocessor.db.mongodb.MongoDbPipelineService), InMemoryPipelineService (org.graylog.plugins.pipelineprocessor.db.memory.InMemoryPipelineService), CreateMessage (org.graylog.plugins.pipelineprocessor.functions.messages.CreateMessage), DropMessage (org.graylog.plugins.pipelineprocessor.functions.messages.DropMessage), Message (org.graylog2.plugin.Message), RuleService (org.graylog.plugins.pipelineprocessor.db.RuleService), MongoDbRuleService (org.graylog.plugins.pipelineprocessor.db.mongodb.MongoDbRuleService), InMemoryRuleService (org.graylog.plugins.pipelineprocessor.db.memory.InMemoryRuleService), SetField (org.graylog.plugins.pipelineprocessor.functions.messages.SetField), Test (org.junit.Test)
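The test reaches for `MessageCollection#source()` because, as the inline comment suggests, iterating the collection is assumed to skip dropped (`filterOut`) messages, while `source()` exposes the unfiltered backing collection. A minimal stand-in showing that split; `Msg` and `FilteredCollection` are toy classes, not Graylog's:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Toy model of the behavior the test depends on: iteration yields only
// messages that were NOT dropped, while source() returns everything.
public class SourceVsIteration {

    static final class Msg {
        final String id;
        final boolean filterOut;

        Msg(String id, boolean filterOut) {
            this.id = id;
            this.filterOut = filterOut;
        }
    }

    static final class FilteredCollection implements Iterable<Msg> {
        private final List<Msg> source;

        FilteredCollection(List<Msg> source) {
            this.source = List.copyOf(source);
        }

        // Iteration skips dropped messages, so downstream consumers
        // never see a message whose filterOut flag is set.
        @Override
        public Iterator<Msg> iterator() {
            List<Msg> kept = new ArrayList<>();
            for (Msg m : source) {
                if (!m.filterOut) {
                    kept.add(m);
                }
            }
            return kept.iterator();
        }

        // The unfiltered view, analogous to MessageCollection#source.
        List<Msg> source() {
            return source;
        }
    }

    static int countIterated(FilteredCollection c) {
        int n = 0;
        for (Msg ignored : c) {
            n++;
        }
        return n;
    }

    public static void main(String[] args) {
        FilteredCollection c = new FilteredCollection(List.of(
                new Msg("kept", false),
                new Msg("dropped", true)));
        System.out.println("iterated: " + countIterated(c)); // dropped message is skipped
        System.out.println("source:   " + c.source().size()); // but still visible via source()
    }
}
```

This is why the assertion on `getFilterOut()` has to go through `source()`: iterating `processed` directly would never surface the dropped message at all.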
