Search in sources:

Example 81 with PipedOutputStream

Use of java.io.PipedOutputStream in the AndroidLife project by CaMnter.

The class ApplicationPatchTest, method checkApplicationPatchReadWrite:

@Test
public void checkApplicationPatchReadWrite() throws IOException {
    PipedInputStream input = new PipedInputStream();
    PipedOutputStream outputStream = new PipedOutputStream(input);
    try {
        DataOutputStream output = new DataOutputStream(outputStream);
        ApplicationPatchUtil.write(output, mPatches, UpdateMode.HOT_SWAP);
        List<ApplicationPatch> patches = ApplicationPatch.read(new DataInputStream(input));
        assertNotNull(patches);
        assertEquals("Should not lose or gain patches", mPatches.size(), patches.size());
        for (int i = 0; i < mPatches.size(); i++) {
            ApplicationPatch expected = mPatches.get(i);
            ApplicationPatch actual = patches.get(i);
            mExpect.that(actual.getBytes()).isEqualTo(expected.getBytes());
            mExpect.that(actual.getPath()).isEqualTo(expected.getPath());
        }
    } finally {
        outputStream.close();
    }
}
Also used: DataOutputStream (java.io.DataOutputStream), PipedOutputStream (java.io.PipedOutputStream), PipedInputStream (java.io.PipedInputStream), DataInputStream (java.io.DataInputStream), Test (org.junit.Test)
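
Note that the test writes into the pipe and reads it back on the same thread, which only works as long as the serialized patches fit inside the pipe's internal buffer (1024 bytes by default). Below is a minimal, self-contained sketch of the same single-threaded round trip with an explicitly sized buffer; the class name and payload values are illustrative placeholders, not the project's ApplicationPatch fields.

import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

public class SingleThreadPipeRoundTrip {

    public static void main(String[] args) throws IOException {
        // Size the pipe buffer generously so a single-threaded write/read cannot block.
        PipedInputStream input = new PipedInputStream(64 * 1024);
        try (PipedOutputStream outputStream = new PipedOutputStream(input);
             DataOutputStream output = new DataOutputStream(outputStream)) {
            // Hypothetical "patch" payload: a path, a length, and the bytes.
            output.writeUTF("patch-path.dex");
            output.writeInt(3);
            output.write(new byte[] { 1, 2, 3 });
        }
        try (DataInputStream in = new DataInputStream(input)) {
            System.out.println(in.readUTF());          // patch-path.dex
            byte[] bytes = new byte[in.readInt()];
            in.readFully(bytes);
            System.out.println(bytes.length + " byte(s) read back");
        }
    }
}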

Example 82 with PipedOutputStream

Use of java.io.PipedOutputStream in the BiglyBT project by BiglySoftware.

The class LocaleUtilitiesImpl, method integrateLocalisedMessageBundle:

@Override
public void integrateLocalisedMessageBundle(Properties p) {
    // Surely there's a more convenient way of doing this?
    ResourceBundle rb = null;
    try {
        PipedInputStream in_stream = new PipedInputStream();
        PipedOutputStream out_stream = new PipedOutputStream(in_stream);
        p.store(out_stream, "");
        out_stream.close();
        rb = new PropertyResourceBundle(in_stream);
        in_stream.close();
    } catch (IOException ioe) {
        return;
    }
    integrateLocalisedMessageBundle(rb);
}
Also used: PipedOutputStream (java.io.PipedOutputStream), PipedInputStream (java.io.PipedInputStream), IOException (java.io.IOException)
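
The code's own comment asks whether there is a more convenient way. One option, sketched below on the assumption that only an in-memory conversion is needed, replaces the pipe with a ByteArrayOutputStream/ByteArrayInputStream pair; unlike the single-threaded pipe above, it does not depend on the stored properties fitting into the pipe's 1024-byte default buffer.

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Properties;
import java.util.PropertyResourceBundle;
import java.util.ResourceBundle;

public class PropertiesToBundle {

    // Convert a Properties object into a ResourceBundle via an in-memory byte buffer.
    static ResourceBundle toBundle(Properties p) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        p.store(out, "");
        return new PropertyResourceBundle(new ByteArrayInputStream(out.toByteArray()));
    }

    public static void main(String[] args) throws IOException {
        Properties p = new Properties();
        p.setProperty("greeting", "hello");
        System.out.println(toBundle(p).getString("greeting"));   // hello
    }
}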

Example 83 with PipedOutputStream

Use of java.io.PipedOutputStream in the AndrOBD project by fr3ts0n.

The class UdpInputStream, method setSocket:

/**
 * set datagram socket for data reading
 */
public void setSocket(DatagramSocket newSocket) {
    // only do something if the socket really changes
    if (newSocket != socket) {
        try {
            closeItAll();
            log.info("Opening UdpInputStream");
            out = new PipedOutputStream(this);
            // set socket to the new one
            socket = newSocket;
            // create a new reader thread ...
            readerThread = new Thread() {

                public void run() {
                    log.info("UdpReader started");
                    try {
                        while (!isInterrupted()) {
                            packet.setLength(buffer.length);
                            socket.receive(packet);
                            charsReceived += packet.getLength();
                            out.write(packet.getData(), 0, packet.getLength());
                            log.fine("RX:" + new String(packet.getData()));
                        }
                    } catch (IOException e) {
                        log.severe(e.toString());
                    }
                    log.info("UdpReader finished");
                }
            };
            // mark socket starting time
            startTime = System.currentTimeMillis();
            // set number of chars received to zero
            charsReceived = 0;
            // and start listening ...
            readerThread.start();
        } catch (IOException e) {
            log.severe(e.toString());
        }
    }
}
Also used: PipedOutputStream (java.io.PipedOutputStream), IOException (java.io.IOException)
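
The call new PipedOutputStream(this) suggests that UdpInputStream extends PipedInputStream and feeds itself from the reader thread. A minimal sketch of that pattern, assuming exactly that inheritance, with a hypothetical class name and payload:

import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.nio.charset.StandardCharsets;

// An InputStream that feeds itself from a background producer through a
// PipedOutputStream connected to "this".
public class SelfFeedingInputStream extends PipedInputStream {

    private final PipedOutputStream out = new PipedOutputStream();

    public SelfFeedingInputStream() throws IOException {
        connect(out);   // equivalent to new PipedOutputStream(this)
        new Thread(() -> {
            try {
                for (int i = 0; i < 3; i++) {
                    out.write(("packet-" + i + "\n").getBytes(StandardCharsets.UTF_8));
                }
                out.close();   // signals end-of-stream to the reading side
            } catch (IOException e) {
                e.printStackTrace();
            }
        }).start();
    }

    public static void main(String[] args) throws IOException {
        try (SelfFeedingInputStream in = new SelfFeedingInputStream()) {
            int b;
            while ((b = in.read()) != -1) {
                System.out.print((char) b);
            }
        }
    }
}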

Example 84 with PipedOutputStream

Use of java.io.PipedOutputStream in the ANNIS project by korpling.

The class AbstractDotVisualizer, method createComponent:

@Override
public ImagePanel createComponent(final VisualizerInput visInput, VisualizationToggle visToggle) {
    try {
        final PipedOutputStream out = new PipedOutputStream();
        final PipedInputStream in = new PipedInputStream(out);
        new Thread(new Runnable() {

            @Override
            public void run() {
                writeOutput(visInput, out);
            }
        }).start();
        String fileName = "dotvis_" + new Random().nextInt(Integer.MAX_VALUE) + ".png";
        StreamResource resource = new StreamResource(new StreamResource.StreamSource() {

            @Override
            public InputStream getStream() {
                return in;
            }
        }, fileName);
        Embedded emb = new Embedded("", resource);
        emb.setMimeType("image/png");
        emb.setSizeFull();
        emb.setStandby("loading image");
        emb.setAlternateText("DOT graph visualization");
        return new ImagePanel(emb);
    } catch (IOException ex) {
        log.error(null, ex);
    }
    return new ImagePanel(new Embedded());
}
Also used: PipedInputStream (java.io.PipedInputStream), InputStream (java.io.InputStream), PipedOutputStream (java.io.PipedOutputStream), IOException (java.io.IOException), Random (java.util.Random), StreamResource (com.vaadin.server.StreamResource), Embedded (com.vaadin.ui.Embedded), ImagePanel (annis.libgui.ImagePanel)
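
Stripped of the Vaadin types, the core of this visualizer is a producer thread writing into a PipedOutputStream while the calling thread consumes the connected PipedInputStream, so the image can be streamed before rendering finishes. A minimal sketch of that handoff, with text standing in for the rendered PNG and a hypothetical class name:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.io.PrintWriter;
import java.nio.charset.StandardCharsets;

public class ProducerConsumerPipe {

    public static void main(String[] args) throws IOException {
        final PipedOutputStream out = new PipedOutputStream();
        final PipedInputStream in = new PipedInputStream(out);

        // Producer thread: stands in for writeOutput(visInput, out); closing the
        // writer closes the pipe and signals end-of-stream to the consumer.
        new Thread(() -> {
            try (PrintWriter writer = new PrintWriter(
                    new OutputStreamWriter(out, StandardCharsets.UTF_8), true)) {
                writer.println("digraph G { a -> b; }");
            }
        }).start();

        // Consumer: the stream that the original code hands to the StreamResource.
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8))) {
            reader.lines().forEach(System.out::println);
        }
    }
}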

Example 85 with PipedOutputStream

Use of java.io.PipedOutputStream in the kafka-connect-vertica project by jcustenborder.

The class VerticaSinkTask, method put:

@Override
public void put(Collection<SinkRecord> records) {
    Multimap<String, SinkRecord> recordsByTable = HashMultimap.create(this.config.expectedTopics, this.config.expectedRecords);
    Multiset<String> countsByTable = HashMultiset.create(this.config.expectedTopics);
    for (SinkRecord record : records) {
        String table = record.topic();
        countsByTable.add(table);
        recordsByTable.put(table, record);
    }
    for (String table : countsByTable.elementSet()) {
        log.trace("put() - Writing {} record(s) to {}", countsByTable.count(table), table);
    }
    DataSource dataSource = PoolOfPools.get(this.config);
    try (Connection connection = dataSource.getConnection()) {
        VerticaConnection verticaConnection = connection.unwrap(VerticaConnection.class);
        try {
            for (final String tableName : recordsByTable.keys()) {
                log.trace("put() - Processing records for table '{}'", tableName);
                Collection<SinkRecord> tableRecords = recordsByTable.get(tableName);
                VerticaStreamWriterBuilder builder = configureBuilder(verticaConnection, tableName);
                final String statement = new QueryBuilder(builder).toString();
                log.info("put() - Creating VerticaCopyStream with statement:\n{}", statement);
                VerticaCopyStream copyStream = new VerticaCopyStream(verticaConnection, statement);
                copyStream.start();
                final PipedInputStream inputStream = new PipedInputStream(this.config.inputBufferSize);
                final PipedOutputStream outputStream = new PipedOutputStream(inputStream);
                try {
                    Stopwatch stopwatch = Stopwatch.createStarted();
                    Future<?> importFuture = executorService.submit(() -> {
                        try {
                            copyStream.addStream(inputStream);
                            copyStream.execute();
                        } catch (SQLException e) {
                            throw new IllegalStateException(e);
                        }
                    });
                    int count = 0;
                    try (VerticaStreamWriter writer = builder.build(outputStream)) {
                        for (SinkRecord record : tableRecords) {
                            Struct value = (Struct) record.value();
                            int i = 0;
                            Object[] values = new Object[writer.columns().size()];
                            for (VerticaColumnInfo columnInfo : writer.columns()) {
                                values[i] = value.get(columnInfo.name());
                                i++;
                            }
                            log.trace("Writing row");
                            writer.write(values);
                            count++;
                        }
                        log.info("Wrote {} record(s) to stream", count);
                    }
                    outputStream.close();
                    log.info("Waiting for import to complete.");
                    try {
                        importFuture.get(this.config.streamTimeoutMs, TimeUnit.MILLISECONDS);
                    } catch (TimeoutException e) {
                        log.warn("Import exceeded timeout of {} ms. Rolling back", this.config.streamTimeoutMs);
                        connection.rollback();
                    }
                    log.info("put() - Imported {} record(s) in {} millisecond(s).", count, stopwatch.elapsed(TimeUnit.MILLISECONDS));
                    final int rejectedCount = copyStream.getRejects().size();
                    if (rejectedCount > 0) {
                        log.warn("put() - Rejected {} record(s).", copyStream.getRejects().size());
                        for (Long l : copyStream.getRejects()) {
                            log.warn("Rejected row {}", l);
                        }
                    }
                } catch (InterruptedException | ExecutionException e) {
                    log.error("Exception thrown", e);
                } finally {
                    inputStream.close();
                }
            }
        } catch (IOException ex) {
            throw new RetriableException(ex);
        } catch (ExecutionException ex) {
            throw new RetriableException(ex);
        }
        log.trace("put() - committing transaction");
        connection.commit();
    } catch (SQLException ex) {
        throw new RetriableException(ex);
    }
}
Also used: SQLException (java.sql.SQLException), Stopwatch (com.google.common.base.Stopwatch), PipedOutputStream (java.io.PipedOutputStream), QueryBuilder (com.github.jcustenborder.vertica.QueryBuilder), Struct (org.apache.kafka.connect.data.Struct), VerticaConnection (com.vertica.jdbc.VerticaConnection), VerticaCopyStream (com.vertica.jdbc.VerticaCopyStream), ExecutionException (java.util.concurrent.ExecutionException), TimeoutException (java.util.concurrent.TimeoutException), VerticaColumnInfo (com.github.jcustenborder.vertica.VerticaColumnInfo), Connection (java.sql.Connection), PipedInputStream (java.io.PipedInputStream), IOException (java.io.IOException), SinkRecord (org.apache.kafka.connect.sink.SinkRecord), DataSource (javax.sql.DataSource), VerticaStreamWriter (com.github.jcustenborder.vertica.VerticaStreamWriter), VerticaStreamWriterBuilder (com.github.jcustenborder.vertica.VerticaStreamWriterBuilder), RetriableException (org.apache.kafka.connect.errors.RetriableException)
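
The piece worth isolating here is the handoff around the pipe: the copy-stream consumer runs on the executor, the task thread writes rows into the PipedOutputStream, closes it so the consumer sees end-of-stream, and only then waits on the Future with a timeout. A stripped-down sketch of that structure using only JDK types; the byte counting stands in for VerticaCopyStream, and the class name, buffer size, and timeout are made up for illustration.

import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class PipedBulkLoadSketch {

    public static void main(String[] args) throws Exception {
        ExecutorService executorService = Executors.newSingleThreadExecutor();
        final PipedInputStream inputStream = new PipedInputStream(8192);
        final PipedOutputStream outputStream = new PipedOutputStream(inputStream);

        // Consumer stands in for copyStream.addStream(inputStream)/execute():
        // it drains the pipe and counts the bytes it receives.
        Future<Long> importFuture = executorService.submit(() -> {
            long total = 0;
            byte[] buffer = new byte[1024];
            int read;
            while ((read = inputStream.read(buffer)) != -1) {
                total += read;
            }
            return total;
        });

        // Producer: write the "rows", then close the pipe so the consumer sees EOF.
        try {
            for (int i = 0; i < 1000; i++) {
                outputStream.write(("row-" + i + "\n").getBytes(StandardCharsets.UTF_8));
            }
        } finally {
            outputStream.close();
        }

        try {
            long bytes = importFuture.get(10_000, TimeUnit.MILLISECONDS);
            System.out.println("Imported " + bytes + " byte(s)");
        } catch (TimeoutException e) {
            System.err.println("Import exceeded timeout; the original code rolls back here");
        } catch (ExecutionException e) {
            System.err.println("Import failed: " + e.getCause());
        } finally {
            inputStream.close();
            executorService.shutdown();
        }
    }
}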

Aggregations

PipedOutputStream (java.io.PipedOutputStream): 221
PipedInputStream (java.io.PipedInputStream): 199
IOException (java.io.IOException): 89
Test (org.junit.Test): 54
InputStream (java.io.InputStream): 30
OutputStream (java.io.OutputStream): 23
BinaryDecoder (co.cask.cdap.common.io.BinaryDecoder): 21
BinaryEncoder (co.cask.cdap.common.io.BinaryEncoder): 21
PrintStream (java.io.PrintStream): 21
ByteArrayOutputStream (java.io.ByteArrayOutputStream): 19
ReflectionDatumReader (co.cask.cdap.internal.io.ReflectionDatumReader): 17
TypeToken (com.google.common.reflect.TypeToken): 17
InputStreamReader (java.io.InputStreamReader): 16
DataInputStream (java.io.DataInputStream): 14
DataOutputStream (java.io.DataOutputStream): 14
BufferedReader (java.io.BufferedReader): 13
Before (org.junit.Before): 12
ByteArrayInputStream (java.io.ByteArrayInputStream): 10
ExecutorService (java.util.concurrent.ExecutorService): 9
ArrayList (java.util.ArrayList): 7