
Example 1 with StandardEvent

Use of org.apache.nifi.processor.util.listen.event.StandardEvent in project nifi by apache.

From the class ListenTCP, method createDispatcher:

@Override
protected ChannelDispatcher createDispatcher(final ProcessContext context, final BlockingQueue<StandardEvent> events) throws IOException {
    final int maxConnections = context.getProperty(MAX_CONNECTIONS).asInteger();
    final int bufferSize = context.getProperty(RECV_BUFFER_SIZE).asDataSize(DataUnit.B).intValue();
    final Charset charSet = Charset.forName(context.getProperty(CHARSET).getValue());
    // initialize the buffer pool based on max number of connections and the buffer size
    final BlockingQueue<ByteBuffer> bufferPool = createBufferPool(maxConnections, bufferSize);
    // if an SSLContextService was provided then create an SSLContext to pass down to the dispatcher
    SSLContext sslContext = null;
    SslContextFactory.ClientAuth clientAuth = null;
    final SSLContextService sslContextService = context.getProperty(SSL_CONTEXT_SERVICE).asControllerService(SSLContextService.class);
    if (sslContextService != null) {
        final String clientAuthValue = context.getProperty(CLIENT_AUTH).getValue();
        sslContext = sslContextService.createSSLContext(SSLContextService.ClientAuth.valueOf(clientAuthValue));
        clientAuth = SslContextFactory.ClientAuth.valueOf(clientAuthValue);
    }
    final EventFactory<StandardEvent> eventFactory = new StandardEventFactory();
    final ChannelHandlerFactory<StandardEvent<SocketChannel>, AsyncChannelDispatcher> handlerFactory = new SocketChannelHandlerFactory<>();
    return new SocketChannelDispatcher(eventFactory, handlerFactory, bufferPool, events, getLogger(), maxConnections, sslContext, clientAuth, charSet);
}
Also used: SocketChannelHandlerFactory (org.apache.nifi.processor.util.listen.handler.socket.SocketChannelHandlerFactory), Charset (java.nio.charset.Charset), SSLContext (javax.net.ssl.SSLContext), ByteBuffer (java.nio.ByteBuffer), StandardEvent (org.apache.nifi.processor.util.listen.event.StandardEvent), SslContextFactory (org.apache.nifi.security.util.SslContextFactory), AsyncChannelDispatcher (org.apache.nifi.processor.util.listen.dispatcher.AsyncChannelDispatcher), SSLContextService (org.apache.nifi.ssl.SSLContextService), RestrictedSSLContextService (org.apache.nifi.ssl.RestrictedSSLContextService), StandardEventFactory (org.apache.nifi.processor.util.listen.event.StandardEventFactory), SocketChannelDispatcher (org.apache.nifi.processor.util.listen.dispatcher.SocketChannelDispatcher)
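The `createBufferPool` call above preallocates one receive buffer per allowed connection so the dispatcher never allocates on the hot path. A minimal, framework-free sketch of that idiom, assuming only the JDK (the class and method names here are illustrative, not NiFi API):

```java
import java.nio.ByteBuffer;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BufferPoolSketch {

    // Preallocate poolSize buffers of bufferSize bytes each; handlers borrow
    // and return them instead of allocating per read.
    public static BlockingQueue<ByteBuffer> createBufferPool(final int poolSize, final int bufferSize) {
        final BlockingQueue<ByteBuffer> pool = new ArrayBlockingQueue<>(poolSize);
        for (int i = 0; i < poolSize; i++) {
            pool.offer(ByteBuffer.allocate(bufferSize));
        }
        return pool;
    }

    public static void main(String[] args) {
        final BlockingQueue<ByteBuffer> pool = createBufferPool(4, 1024);
        // a handler borrows a buffer, reads into it, then clears and returns it
        final ByteBuffer buf = pool.poll();
        buf.clear();
        pool.offer(buf);
        System.out.println("buffers available: " + pool.size());
    }
}
```

Bounding the pool at the max connection count means a slow handler exerts natural backpressure: when no buffer is free, no new read proceeds.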

Example 2 with StandardEvent

Use of org.apache.nifi.processor.util.listen.event.StandardEvent in project nifi by apache.

From the class ListenUDPRecord, method handleParseFailure:

private void handleParseFailure(final StandardEvent event, final ProcessSession session, final Exception cause, final String message) {
    // If we are unable to parse the data, we need to transfer it to 'parse failure' relationship
    final Map<String, String> attributes = getAttributes(event.getSender());
    FlowFile failureFlowFile = session.create();
    failureFlowFile = session.write(failureFlowFile, out -> out.write(event.getData()));
    failureFlowFile = session.putAllAttributes(failureFlowFile, attributes);
    final String transitUri = getTransitUri(event.getSender());
    session.getProvenanceReporter().receive(failureFlowFile, transitUri);
    session.transfer(failureFlowFile, REL_PARSE_FAILURE);
    if (cause == null) {
        getLogger().error(message);
    } else {
        getLogger().error(message, cause);
    }
    session.adjustCounter("Parse Failures", 1, false);
}
Also used: StandardValidators (org.apache.nifi.processor.util.StandardValidators), Arrays (java.util.Arrays), StringUtils (org.apache.commons.lang3.StringUtils), PropertyDescriptor (org.apache.nifi.components.PropertyDescriptor), ByteBuffer (java.nio.ByteBuffer), InetAddress (java.net.InetAddress), RecordSchema (org.apache.nifi.serialization.record.RecordSchema), ByteArrayInputStream (java.io.ByteArrayInputStream), WritesAttributes (org.apache.nifi.annotation.behavior.WritesAttributes), RecordReader (org.apache.nifi.serialization.RecordReader), Map (java.util.Map), EventFactory (org.apache.nifi.processor.util.listen.event.EventFactory), FlowFile (org.apache.nifi.flowfile.FlowFile), WriteResult (org.apache.nifi.serialization.WriteResult), Collection (java.util.Collection), BlockingQueue (java.util.concurrent.BlockingQueue), WritesAttribute (org.apache.nifi.annotation.behavior.WritesAttribute), IOUtils (org.apache.commons.io.IOUtils), InputRequirement (org.apache.nifi.annotation.behavior.InputRequirement), List (java.util.List), RecordReaderFactory (org.apache.nifi.serialization.RecordReaderFactory), Tags (org.apache.nifi.annotation.documentation.Tags), DataUnit (org.apache.nifi.processor.DataUnit), CapabilityDescription (org.apache.nifi.annotation.documentation.CapabilityDescription), ValidationContext (org.apache.nifi.components.ValidationContext), DatagramChannelDispatcher (org.apache.nifi.processor.util.listen.dispatcher.DatagramChannelDispatcher), HashMap (java.util.HashMap), ProcessException (org.apache.nifi.processor.exception.ProcessException), ArrayList (java.util.ArrayList), Relationship (org.apache.nifi.processor.Relationship), StandardEvent (org.apache.nifi.processor.util.listen.event.StandardEvent), ValidationResult (org.apache.nifi.components.ValidationResult), Record (org.apache.nifi.serialization.record.Record), OutputStream (java.io.OutputStream), Validator (org.apache.nifi.components.Validator), ChannelDispatcher (org.apache.nifi.processor.util.listen.dispatcher.ChannelDispatcher), ProcessContext (org.apache.nifi.processor.ProcessContext), ProcessSession (org.apache.nifi.processor.ProcessSession), RecordSetWriterFactory (org.apache.nifi.serialization.RecordSetWriterFactory), IOException (java.io.IOException), AbstractListenEventProcessor (org.apache.nifi.processor.util.listen.AbstractListenEventProcessor), UnknownHostException (java.net.UnknownHostException), TimeUnit (java.util.concurrent.TimeUnit), OnScheduled (org.apache.nifi.annotation.lifecycle.OnScheduled), SupportsBatching (org.apache.nifi.annotation.behavior.SupportsBatching), StandardEventFactory (org.apache.nifi.processor.util.listen.event.StandardEventFactory), CoreAttributes (org.apache.nifi.flowfile.attributes.CoreAttributes), RecordSetWriter (org.apache.nifi.serialization.RecordSetWriter), Collections (java.util.Collections), InputStream (java.io.InputStream)
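The essential behavior of handleParseFailure is that the original bytes are preserved and routed, unmodified, to a dedicated failure destination instead of being dropped. A minimal sketch of that routing decision, assuming only the JDK and using a toy integer parser as a stand-in for the real RecordReader (all names here are illustrative):

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class ParseFailureRoutingSketch {

    // stand-ins for the processor's success and parse-failure relationships
    static final List<Integer> SUCCESS = new ArrayList<>();
    static final List<String> PARSE_FAILURE = new ArrayList<>();

    public static void route(final byte[] data) {
        try {
            // toy parser; the real processor reads records from the datagram
            SUCCESS.add(Integer.parseInt(new String(data, StandardCharsets.UTF_8).trim()));
        } catch (final NumberFormatException e) {
            // keep the unmodified payload so no data is lost on failure
            PARSE_FAILURE.add(new String(data, StandardCharsets.UTF_8));
        }
    }

    public static void main(String[] args) {
        route("42".getBytes(StandardCharsets.UTF_8));
        route("WILL NOT PARSE".getBytes(StandardCharsets.UTF_8));
        System.out.println(SUCCESS + " / " + PARSE_FAILURE);
    }
}
```

As in the NiFi method, the failure path also carries the sender metadata along with the raw payload, so a downstream consumer can diagnose or replay the bad datagram.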

Example 3 with StandardEvent

Use of org.apache.nifi.processor.util.listen.event.StandardEvent in project nifi by apache.

From the class TestListenUDPRecord, method testParseFailure:

@Test
public void testParseFailure() {
    final String sender = "foo";
    final StandardEvent event1 = new StandardEvent(sender, DATAGRAM_1.getBytes(StandardCharsets.UTF_8), null);
    proc.addEvent(event1);
    final StandardEvent event2 = new StandardEvent(sender, "WILL NOT PARSE".getBytes(StandardCharsets.UTF_8), null);
    proc.addEvent(event2);
    runner.run();
    runner.assertTransferCount(ListenUDPRecord.REL_SUCCESS, 1);
    runner.assertTransferCount(ListenUDPRecord.REL_PARSE_FAILURE, 1);
    final MockFlowFile flowFile = runner.getFlowFilesForRelationship(ListenUDPRecord.REL_PARSE_FAILURE).get(0);
    flowFile.assertContentEquals("WILL NOT PARSE");
}
Also used: MockFlowFile (org.apache.nifi.util.MockFlowFile), StandardEvent (org.apache.nifi.processor.util.listen.event.StandardEvent), Test (org.junit.Test)
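The testing idiom above is worth noting: synthetic events are preloaded into the queue the processor polls from, the processor is triggered once, and the test asserts on where each event was routed. A framework-free sketch of the same idiom, assuming only the JDK (the class here is a toy stand-in, not the NiFi test runner):

```java
import java.util.ArrayDeque;
import java.util.Queue;

public class MockEventQueueSketch {

    final Queue<String> events = new ArrayDeque<>();
    int success = 0;
    int parseFailure = 0;

    // analogous to proc.addEvent(...) in the NiFi test
    void addEvent(final String event) {
        events.offer(event);
    }

    // analogous to runner.run(): drain the queue and route each event
    void run() {
        String event;
        while ((event = events.poll()) != null) {
            if (!event.isEmpty() && event.chars().allMatch(Character::isDigit)) {
                success++;
            } else {
                parseFailure++;
            }
        }
    }

    public static void main(String[] args) {
        final MockEventQueueSketch proc = new MockEventQueueSketch();
        proc.addEvent("12345");
        proc.addEvent("WILL NOT PARSE");
        proc.run();
        System.out.println(proc.success + " succeeded, " + proc.parseFailure + " failed");
    }
}
```

Injecting events at the queue boundary lets the routing logic be tested without opening a real UDP socket, which is exactly what the mocked processor in the NiFi test achieves.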

Example 4 with StandardEvent

Use of org.apache.nifi.processor.util.listen.event.StandardEvent in project nifi by apache.

From the class ListenUDPRecord, method onTrigger:

@Override
public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
    final int maxBatchSize = context.getProperty(BATCH_SIZE).asInteger();
    final Map<String, FlowFileRecordWriter> flowFileRecordWriters = new HashMap<>();
    final RecordReaderFactory readerFactory = context.getProperty(RECORD_READER).asControllerService(RecordReaderFactory.class);
    final RecordSetWriterFactory writerFactory = context.getProperty(RECORD_WRITER).asControllerService(RecordSetWriterFactory.class);
    for (int i = 0; i < maxBatchSize; i++) {
        // this processor isn't leveraging the error queue so don't bother polling to avoid the overhead
        // if the error handling is ever changed to use the error queue then this flag needs to be changed as well
        final StandardEvent event = getMessage(true, false, session);
        // break out if we don't have any messages, don't yield since we already do a long poll inside getMessage
        if (event == null) {
            break;
        }
        // attempt to read all of the records from the current datagram into a list in memory so that we can ensure the
        // entire datagram can be read as records, and if not transfer the whole thing to parse.failure
        final RecordReader reader;
        final List<Record> records = new ArrayList<>();
        try (final InputStream in = new ByteArrayInputStream(event.getData())) {
            reader = readerFactory.createRecordReader(Collections.emptyMap(), in, getLogger());
            Record record;
            while ((record = reader.nextRecord()) != null) {
                records.add(record);
            }
        } catch (final Exception e) {
            handleParseFailure(event, session, e);
            continue;
        }
        if (records.size() == 0) {
            handleParseFailure(event, session, null);
            continue;
        }
        // see if we already started a flow file and writer for the given sender
        // if an exception happens creating the flow file or writer, put the event in the error queue to try it again later
        FlowFileRecordWriter flowFileRecordWriter = flowFileRecordWriters.get(event.getSender());
        if (flowFileRecordWriter == null) {
            FlowFile flowFile = null;
            OutputStream rawOut = null;
            RecordSetWriter writer = null;
            try {
                flowFile = session.create();
                rawOut = session.write(flowFile);
                final Record firstRecord = records.get(0);
                final RecordSchema recordSchema = firstRecord.getSchema();
                final RecordSchema writeSchema = writerFactory.getSchema(Collections.emptyMap(), recordSchema);
                writer = writerFactory.createWriter(getLogger(), writeSchema, rawOut);
                writer.beginRecordSet();
                flowFileRecordWriter = new FlowFileRecordWriter(flowFile, writer);
                flowFileRecordWriters.put(event.getSender(), flowFileRecordWriter);
            } catch (final Exception ex) {
                getLogger().error("Failed to properly initialize record writer. Datagram will be queued for re-processing.", ex);
                try {
                    if (writer != null) {
                        writer.close();
                    }
                } catch (final Exception e) {
                    getLogger().warn("Failed to close Record Writer", e);
                }
                if (rawOut != null) {
                    IOUtils.closeQuietly(rawOut);
                }
                if (flowFile != null) {
                    session.remove(flowFile);
                }
                context.yield();
                break;
            }
        }
        // attempt to write each record, if any record fails then remove the flow file and break out of the loop
        final RecordSetWriter writer = flowFileRecordWriter.getRecordWriter();
        try {
            for (final Record record : records) {
                writer.write(record);
            }
        } catch (Exception e) {
            getLogger().error("Failed to write records due to: " + e.getMessage(), e);
            IOUtils.closeQuietly(writer);
            session.remove(flowFileRecordWriter.getFlowFile());
            flowFileRecordWriters.remove(event.getSender());
            break;
        }
    }
    for (final Map.Entry<String, FlowFileRecordWriter> entry : flowFileRecordWriters.entrySet()) {
        final String sender = entry.getKey();
        final FlowFileRecordWriter flowFileRecordWriter = entry.getValue();
        final RecordSetWriter writer = flowFileRecordWriter.getRecordWriter();
        FlowFile flowFile = flowFileRecordWriter.getFlowFile();
        try {
            final WriteResult writeResult;
            try {
                writeResult = writer.finishRecordSet();
            } finally {
                writer.close();
            }
            if (writeResult.getRecordCount() == 0) {
                session.remove(flowFile);
                continue;
            }
            final Map<String, String> attributes = new HashMap<>();
            attributes.putAll(getAttributes(sender));
            attributes.putAll(writeResult.getAttributes());
            attributes.put(CoreAttributes.MIME_TYPE.key(), writer.getMimeType());
            attributes.put(RECORD_COUNT_ATTR, String.valueOf(writeResult.getRecordCount()));
            flowFile = session.putAllAttributes(flowFile, attributes);
            session.transfer(flowFile, REL_SUCCESS);
            final String transitUri = getTransitUri(sender);
            session.getProvenanceReporter().receive(flowFile, transitUri);
        } catch (final Exception e) {
            getLogger().error("Unable to properly complete record set due to: " + e.getMessage(), e);
            session.remove(flowFile);
        }
    }
}
Also used: FlowFile (org.apache.nifi.flowfile.FlowFile), HashMap (java.util.HashMap), ByteArrayInputStream (java.io.ByteArrayInputStream), InputStream (java.io.InputStream), RecordReader (org.apache.nifi.serialization.RecordReader), OutputStream (java.io.OutputStream), ArrayList (java.util.ArrayList), RecordSetWriter (org.apache.nifi.serialization.RecordSetWriter), ProcessException (org.apache.nifi.processor.exception.ProcessException), IOException (java.io.IOException), UnknownHostException (java.net.UnknownHostException), RecordReaderFactory (org.apache.nifi.serialization.RecordReaderFactory), StandardEvent (org.apache.nifi.processor.util.listen.event.StandardEvent), WriteResult (org.apache.nifi.serialization.WriteResult), RecordSetWriterFactory (org.apache.nifi.serialization.RecordSetWriterFactory), Record (org.apache.nifi.serialization.record.Record), RecordSchema (org.apache.nifi.serialization.record.RecordSchema), Map (java.util.Map)
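The core structure of onTrigger is a per-sender grouping: events pulled in one batch are keyed by sender, a writer is lazily created on the first event from each sender, and each group becomes one output FlowFile. A simplified, framework-free sketch of that grouping step, assuming only the JDK (event and record types are reduced to strings for illustration):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class PerSenderBatchingSketch {

    // Each event is a {sender, payload} pair; events from the same sender
    // accumulate into one group, mirroring one FlowFile per sender.
    public static Map<String, List<String>> groupBySender(final List<String[]> events) {
        final Map<String, List<String>> bySender = new HashMap<>();
        for (final String[] event : events) {
            final String sender = event[0];
            final String payload = event[1];
            // computeIfAbsent mirrors "create the writer on the first event from this sender"
            bySender.computeIfAbsent(sender, s -> new ArrayList<>()).add(payload);
        }
        return bySender;
    }

    public static void main(String[] args) {
        final List<String[]> events = List.of(
                new String[] {"sender1", "m1"}, new String[] {"sender1", "m2"},
                new String[] {"sender2", "m3"}, new String[] {"sender2", "m4"});
        // four events from two senders yield two groups
        System.out.println(groupBySender(events).size() + " groups");
    }
}
```

The real method does the same keying with flowFileRecordWriters.get(event.getSender()), then finalizes each group in a second loop, setting the record-count and MIME-type attributes before transferring to success.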

Example 5 with StandardEvent

Use of org.apache.nifi.processor.util.listen.event.StandardEvent in project nifi by apache.

From the class TestListenUDP, method testBatchingWithDifferentSenders:

@Test
public void testBatchingWithDifferentSenders() throws IOException, InterruptedException {
    final String sender1 = "sender1";
    final String sender2 = "sender2";
    final ChannelResponder responder = Mockito.mock(ChannelResponder.class);
    final byte[] message = "test message".getBytes(StandardCharsets.UTF_8);
    final List<StandardEvent> mockEvents = new ArrayList<>();
    mockEvents.add(new StandardEvent(sender1, message, responder));
    mockEvents.add(new StandardEvent(sender1, message, responder));
    mockEvents.add(new StandardEvent(sender2, message, responder));
    mockEvents.add(new StandardEvent(sender2, message, responder));
    MockListenUDP mockListenUDP = new MockListenUDP(mockEvents);
    runner = TestRunners.newTestRunner(mockListenUDP);
    runner.setProperty(ListenRELP.PORT, "1");
    runner.setProperty(ListenRELP.MAX_BATCH_SIZE, "10");
    // sending 4 messages with a batch size of 10, but should get 2 FlowFiles because of different senders
    runner.run();
    runner.assertAllFlowFilesTransferred(ListenRELP.REL_SUCCESS, 2);
    verifyProvenance(2);
}
Also used: ChannelResponder (org.apache.nifi.processor.util.listen.response.ChannelResponder), StandardEvent (org.apache.nifi.processor.util.listen.event.StandardEvent), ArrayList (java.util.ArrayList), Test (org.junit.Test)

Aggregations

StandardEvent (org.apache.nifi.processor.util.listen.event.StandardEvent): 9 uses
Test (org.junit.Test): 5 uses
ArrayList (java.util.ArrayList): 3 uses
MockFlowFile (org.apache.nifi.util.MockFlowFile): 3 uses
ByteArrayInputStream (java.io.ByteArrayInputStream): 2 uses
IOException (java.io.IOException): 2 uses
InputStream (java.io.InputStream): 2 uses
OutputStream (java.io.OutputStream): 2 uses
UnknownHostException (java.net.UnknownHostException): 2 uses
ByteBuffer (java.nio.ByteBuffer): 2 uses
HashMap (java.util.HashMap): 2 uses
Map (java.util.Map): 2 uses
FlowFile (org.apache.nifi.flowfile.FlowFile): 2 uses
ProcessException (org.apache.nifi.processor.exception.ProcessException): 2 uses
StandardEventFactory (org.apache.nifi.processor.util.listen.event.StandardEventFactory): 2 uses
InetAddress (java.net.InetAddress): 1 use
Charset (java.nio.charset.Charset): 1 use
Arrays (java.util.Arrays): 1 use
Collection (java.util.Collection): 1 use
Collections (java.util.Collections): 1 use