
Example 36 with Pipeline

Use of com.hazelcast.jet.pipeline.Pipeline in project hazelcast-jet-reference-manual by hazelcast.

From the class FileAndSocket, method s4.

static void s4() {
    // tag::s4[]
    Pipeline p = Pipeline.create();
    // Stream UTF-8 text lines from a TCP socket and write each line to Jet's log
    p.drawFrom(Sources.socket("localhost", 8080, StandardCharsets.UTF_8))
     .drainTo(Sinks.logger());
    // end::s4[]
}
Also used : Pipeline(com.hazelcast.jet.pipeline.Pipeline)
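
The snippet only builds the pipeline; nothing runs until it is submitted to a Jet instance. A minimal sketch of doing that, assuming an embedded member started just for this example (the setup is not part of the original snippet):

// assumes: import com.hazelcast.jet.Jet; import com.hazelcast.jet.JetInstance;
JetInstance jet = Jet.newJetInstance();   // start an embedded Jet member
try {
    // submit the pipeline and wait for completion; the socket source ends when the connection closes
    jet.newJob(p).join();
} finally {
    Jet.shutdownAll();
}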

Example 37 with Pipeline

Use of com.hazelcast.jet.pipeline.Pipeline in project hazelcast-jet-reference-manual by hazelcast.

From the class FileAndSocket, method s3.

static void s3() {
    // tag::s3[]
    Pipeline p = Pipeline.create();
    // Read items from the Hazelcast list "inputList" and write them out as files in the given directory
    p.drawFrom(Sources.list("inputList"))
     .drainTo(Sinks.files("/home/jet/output"));
    // end::s3[]
}
Also used : Pipeline(com.hazelcast.jet.pipeline.Pipeline)
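
Since Sources.list is a batch source, the IList must hold data before the job is submitted. A hypothetical sketch of populating "inputList" and running the job (the JetInstance named jet and the sample items are assumptions, not part of the manual snippet):

// assumes: import com.hazelcast.core.IList; and an embedded JetInstance named jet
IList<String> inputList = jet.getHazelcastInstance().getList("inputList");
inputList.add("first item");
inputList.add("second item");
jet.newJob(p).join();   // each list item becomes a line in a file under /home/jet/output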

Example 38 with Pipeline

Use of com.hazelcast.jet.pipeline.Pipeline in project hazelcast-jet-reference-manual by hazelcast.

From the class HdfsAndKafka, method s3.

static void s3() {
    // tag::s3[]
    Properties props = new Properties();
    props.setProperty("bootstrap.servers", "localhost:9092");
    props.setProperty("key.serializer", StringSerializer.class.getCanonicalName());
    props.setProperty("key.deserializer", StringDeserializer.class.getCanonicalName());
    props.setProperty("value.serializer", IntegerSerializer.class.getCanonicalName());
    props.setProperty("value.deserializer", IntegerDeserializer.class.getCanonicalName());
    props.setProperty("auto.offset.reset", "earliest");

    Pipeline p = Pipeline.create();
    // Consume records from topics t1 and t2 and republish each record to topic t3
    p.drawFrom(KafkaSources.kafka(props, "t1", "t2"))
     .drainTo(KafkaSinks.kafka(props, "t3"));
    // end::s3[]
}
Also used : IntegerDeserializer(org.apache.kafka.common.serialization.IntegerDeserializer) StringDeserializer(org.apache.kafka.common.serialization.StringDeserializer) Properties(java.util.Properties) StringSerializer(org.apache.kafka.common.serialization.StringSerializer) IntegerSerializer(org.apache.kafka.common.serialization.IntegerSerializer) Pipeline(com.hazelcast.jet.pipeline.Pipeline)
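
Here the same Properties object serves both the consumer-side source and the producer-side sink, so it carries serializers and deserializers together. Purely as an illustration (not taken from the manual), the two roles could also be configured separately:

Properties consumerProps = new Properties();
consumerProps.setProperty("bootstrap.servers", "localhost:9092");
consumerProps.setProperty("key.deserializer", StringDeserializer.class.getCanonicalName());
consumerProps.setProperty("value.deserializer", IntegerDeserializer.class.getCanonicalName());
consumerProps.setProperty("auto.offset.reset", "earliest");

Properties producerProps = new Properties();
producerProps.setProperty("bootstrap.servers", "localhost:9092");
producerProps.setProperty("key.serializer", StringSerializer.class.getCanonicalName());
producerProps.setProperty("value.serializer", IntegerSerializer.class.getCanonicalName());

Pipeline p = Pipeline.create();
// the source uses only consumer settings, the sink only producer settings
p.drawFrom(KafkaSources.kafka(consumerProps, "t1", "t2"))
 .drainTo(KafkaSinks.kafka(producerProps, "t3"));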

Example 39 with Pipeline

Use of com.hazelcast.jet.pipeline.Pipeline in project hazelcast-jet-reference-manual by hazelcast.

From the class HdfsAndKafka, method s2.

static void s2() {
    JobConf jobConfig = new JobConf();
    // tag::s2[]
    Pipeline p = Pipeline.create();
    // Word count over HDFS: read text, split it into lowercase words, count per word, write back to HDFS
    p.drawFrom(HdfsSources.hdfs(jobConfig, (k, v) -> v.toString()))
     .flatMap(line -> traverseArray(line.toLowerCase().split("\\W+"))
                          .filter(w -> !w.isEmpty()))
     .groupingKey(wholeItem())
     .aggregate(counting())
     .drainTo(HdfsSinks.hdfs(jobConfig));
    // end::s2[]
}
Also used : TextInputFormat(org.apache.hadoop.mapred.TextInputFormat) AggregateOperations.counting(com.hazelcast.jet.aggregate.AggregateOperations.counting) Properties(java.util.Properties) HdfsSinks(com.hazelcast.jet.hadoop.HdfsSinks) Pipeline(com.hazelcast.jet.pipeline.Pipeline) KafkaSources(com.hazelcast.jet.kafka.KafkaSources) TextOutputFormat(org.apache.hadoop.mapred.TextOutputFormat) DistributedFunctions.wholeItem(com.hazelcast.jet.function.DistributedFunctions.wholeItem) JobConf(org.apache.hadoop.mapred.JobConf) HdfsSources(com.hazelcast.jet.hadoop.HdfsSources) StringDeserializer(org.apache.kafka.common.serialization.StringDeserializer) Traversers.traverseArray(com.hazelcast.jet.Traversers.traverseArray) IntegerSerializer(org.apache.kafka.common.serialization.IntegerSerializer) Path(org.apache.hadoop.fs.Path) IntegerDeserializer(org.apache.kafka.common.serialization.IntegerDeserializer) StringSerializer(org.apache.kafka.common.serialization.StringSerializer) KafkaSinks(com.hazelcast.jet.kafka.KafkaSinks) JobConf(org.apache.hadoop.mapred.JobConf) Pipeline(com.hazelcast.jet.pipeline.Pipeline)
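
The JobConf is created empty inside the method. Judging by the imports listed above (TextInputFormat, TextOutputFormat, Path), the surrounding class presumably configures input and output formats and paths; a rough sketch of such a setup, with hypothetical HDFS paths, could look like:

// assumes: import org.apache.hadoop.mapred.FileInputFormat; import org.apache.hadoop.mapred.FileOutputFormat;
JobConf jobConfig = new JobConf();
jobConfig.setInputFormat(TextInputFormat.class);
jobConfig.setOutputFormat(TextOutputFormat.class);
FileInputFormat.addInputPath(jobConfig, new Path("/input-path"));      // hypothetical input directory
FileOutputFormat.setOutputPath(jobConfig, new Path("/output-path"));   // hypothetical output directory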

Example 40 with Pipeline

Use of com.hazelcast.jet.pipeline.Pipeline in project hazelcast-jet-reference-manual by hazelcast.

From the class ImdgConnectors, method s13.

static void s13() {
    ClientConfig someClientConfig = new ClientConfig();
    // tag::s13[]
    Pipeline p = Pipeline.create();
    // Stream entry events from the event journals of a map and a cache in a remote cluster,
    // starting from the current journal position
    StreamStage<Entry<String, Long>> fromRemoteMap = p.drawFrom(
            Sources.<String, Long>remoteMapJournal("inputMap", someClientConfig, START_FROM_CURRENT));
    StreamStage<Entry<String, Long>> fromRemoteCache = p.drawFrom(
            Sources.<String, Long>remoteCacheJournal("inputCache", someClientConfig, START_FROM_CURRENT));
    // end::s13[]
}
Also used : Entry(java.util.Map.Entry) ClientConfig(com.hazelcast.client.config.ClientConfig) Pipeline(com.hazelcast.jet.pipeline.Pipeline)
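
The two stages are drawn but never drained, so the snippet is not a complete pipeline by itself. A minimal sketch of finishing it, with the sink choice assumed rather than taken from the manual:

// attach a sink to each stage, e.g. the logger sink, so the pipeline can be submitted
fromRemoteMap.drainTo(Sinks.logger());
fromRemoteCache.drainTo(Sinks.logger());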

Aggregations

Pipeline (com.hazelcast.jet.pipeline.Pipeline) 379
Test (org.junit.Test) 300
ParallelJVMTest (com.hazelcast.test.annotation.ParallelJVMTest) 142
QuickTest (com.hazelcast.test.annotation.QuickTest) 142
Job (com.hazelcast.jet.Job) 125
Sinks (com.hazelcast.jet.pipeline.Sinks) 107
Category (org.junit.experimental.categories.Category) 100
HazelcastInstance (com.hazelcast.core.HazelcastInstance) 94
JobConfig (com.hazelcast.jet.config.JobConfig) 86
Assert.assertEquals (org.junit.Assert.assertEquals) 73
List (java.util.List) 72
NightlyTest (com.hazelcast.test.annotation.NightlyTest) 65
Before (org.junit.Before) 64
Entry (java.util.Map.Entry) 61
TestSources (com.hazelcast.jet.pipeline.test.TestSources) 58
Assert.assertTrue (org.junit.Assert.assertTrue) 50
Sources (com.hazelcast.jet.pipeline.Sources) 49
IOException (java.io.IOException) 48
BeforeClass (org.junit.BeforeClass) 48
SimpleTestInClusterSupport (com.hazelcast.jet.SimpleTestInClusterSupport) 42