
Example 36 with Tuple

use of org.apache.flink.api.java.tuple.Tuple in project flink by apache.

the class ScatterFunction method sendMessageToAllNeighbors.

/**
 * Sends the given message to all vertices that are targets of an edge of the changed vertex.
 * This method is mutually exclusive to the method {@link #getEdges()} and may be called only once.
 * <p>
 * If the {@link EdgeDirection} is OUT (default), the message will be sent to out-neighbors.
 * If the {@link EdgeDirection} is IN, the message will be sent to in-neighbors.
 * If the {@link EdgeDirection} is ALL, the message will be sent to all neighbors.
 *
 * @param m The message to send.
 */
public void sendMessageToAllNeighbors(Message m) {
    if (edgesUsed) {
        throw new IllegalStateException("Can use either 'getEdges()' or 'sendMessageToAllNeighbors()' exactly once.");
    }
    edgesUsed = true;
    outValue.f1 = m;
    while (edges.hasNext()) {
        Tuple next = (Tuple) edges.next();
        if (getDirection().equals(EdgeDirection.OUT)) {
            /*
             * When EdgeDirection is OUT, the edges iterator only has the out-edges
             * of the vertex, i.e. the ones where this vertex is src.
             * next.getField(1) gives the neighbor of the vertex running this ScatterFunction.
             */
            outValue.f0 = next.getField(1);
        } else if (getDirection().equals(EdgeDirection.IN)) {
            /*
             * When EdgeDirection is IN, the edges iterator only has the in-edges
             * of the vertex, i.e. the ones where this vertex is trg.
             * next.getField(0) gives the neighbor of the vertex running this ScatterFunction.
             */
            outValue.f0 = next.getField(0);
        }
        // When EdgeDirection is ALL, the edges iterator contains both in- and out- edges
        if (getDirection().equals(EdgeDirection.ALL)) {
            if (next.getField(0).equals(vertexId)) {
                // send msg to the trg
                outValue.f0 = next.getField(1);
            } else {
                // send msg to the src
                outValue.f0 = next.getField(0);
            }
        }
        out.collect(outValue);
    }
}
Also used : Tuple(org.apache.flink.api.java.tuple.Tuple)
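
The neighbor lookup above relies on positional field access through the untyped Tuple interface. Here is a minimal standalone sketch (not part of the Flink sources; the class name and values are illustrative) showing how getField(int) mirrors the typed f0/f1 fields of an edge-like Tuple2:

import org.apache.flink.api.java.tuple.Tuple;
import org.apache.flink.api.java.tuple.Tuple2;

public class TupleFieldAccessSketch {

    public static void main(String[] args) {
        // An edge (src, trg) represented as a typed Tuple2.
        Tuple2<Long, Long> edge = new Tuple2<>(1L, 2L);

        // The same object viewed through the untyped Tuple interface,
        // which is how the ScatterFunction above iterates over its edges.
        Tuple untyped = edge;

        // getField(0) corresponds to edge.f0 (the source), getField(1) to edge.f1 (the target).
        Long src = untyped.getField(0);
        Long trg = untyped.getField(1);

        System.out.println("src=" + src + ", trg=" + trg);
    }
}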

Example 37 with Tuple

use of org.apache.flink.api.java.tuple.Tuple in project flink by apache.

the class PythonPlanBinder method receiveParameters.

private void receiveParameters() throws IOException {
    for (int x = 0; x < 3; x++) {
        Tuple value = (Tuple) streamer.getRecord(true);
        switch(Parameters.valueOf(((String) value.getField(0)).toUpperCase())) {
            case DOP:
                Integer dop = (Integer) value.getField(1);
                env.setParallelism(dop);
                break;
            case MODE:
                FLINK_HDFS_PATH = (Boolean) value.getField(1) ? "file:/tmp/flink" : "hdfs:/tmp/flink";
                break;
            case RETRY:
                int retry = (Integer) value.getField(1);
                env.setRestartStrategy(RestartStrategies.fixedDelayRestart(retry, 10000L));
                break;
        }
    }
    if (env.getParallelism() < 0) {
        env.setParallelism(1);
    }
}
Also used : DatasizeHint(org.apache.flink.python.api.PythonOperationInfo.DatasizeHint) Tuple(org.apache.flink.api.java.tuple.Tuple)
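
The handshake above is a sequence of (name, value) pairs read as Tuple records from the Python process. Below is a hedged sketch of the same decoding pattern in isolation (the record values and the local Parameters enum are stand-ins for the Flink-internal streamer protocol; only the Tuple access mirrors the method above):

import org.apache.flink.api.java.tuple.Tuple;
import org.apache.flink.api.java.tuple.Tuple2;

public class ParameterDecodingSketch {

    // Local stand-in for the Parameters enum used by PythonPlanBinder.
    enum Parameters { DOP, MODE, RETRY }

    public static void main(String[] args) {
        // Stand-ins for the three records the Python side would send.
        Tuple[] records = {
            new Tuple2<>("dop", 4),
            new Tuple2<>("mode", Boolean.TRUE),
            new Tuple2<>("retry", 3)
        };

        for (Tuple value : records) {
            switch (Parameters.valueOf(((String) value.getField(0)).toUpperCase())) {
                case DOP:
                    System.out.println("parallelism = " + value.getField(1));
                    break;
                case MODE:
                    System.out.println("local mode = " + value.getField(1));
                    break;
                case RETRY:
                    System.out.println("retries = " + value.getField(1));
                    break;
            }
        }
    }
}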

Example 38 with Tuple

use of org.apache.flink.api.java.tuple.Tuple in project flink by apache.

the class FlinkOutputFieldsDeclarer method getOutputType.

/**
 * Returns {@link TypeInformation} for the declared output schema for a specific stream.
 *
 * @param streamId
 *            A stream ID.
 *
 * @return output type information for the declared output schema of the specified stream; or {@code null} if
 *         {@code streamId == null}
 *
 * @throws IllegalArgumentException
 *             If no output schema was declared for the specified stream or if more than 24 attributes were declared.
 */
TypeInformation<Tuple> getOutputType(final String streamId) throws IllegalArgumentException {
    if (streamId == null) {
        return null;
    }
    Fields outputSchema = this.outputStreams.get(streamId);
    if (outputSchema == null) {
        throw new IllegalArgumentException("Stream with ID '" + streamId + "' was not declared.");
    }
    Tuple t;
    final int numberOfAttributes = outputSchema.size();
    if (numberOfAttributes <= 24) {
        try {
            t = Tuple.getTupleClass(numberOfAttributes + 1).newInstance();
        } catch (final InstantiationException e) {
            throw new RuntimeException(e);
        } catch (final IllegalAccessException e) {
            throw new RuntimeException(e);
        }
    } else {
        throw new IllegalArgumentException("Flink supports only a maximum number of 24 attributes");
    }
    // TODO: declare only key fields as DefaultComparable
    for (int i = 0; i < numberOfAttributes + 1; ++i) {
        t.setField(new DefaultComparable(), i);
    }
    return TypeExtractor.getForObject(t);
}
Also used : Fields(org.apache.storm.tuple.Fields) Tuple(org.apache.flink.api.java.tuple.Tuple)
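
Creating a Tuple of dynamic arity and deriving its TypeInformation, as getOutputType does above, can be exercised in isolation. A minimal sketch under the assumption of a fixed arity of 3 (class name and field values are illustrative only):

import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple;
import org.apache.flink.api.java.typeutils.TypeExtractor;

public class DynamicTupleTypeSketch {

    public static void main(String[] args) throws Exception {
        int arity = 3;

        // Tuple.getTupleClass(n) returns the concrete TupleN class (Tuple0 through Tuple25).
        Tuple t = Tuple.getTupleClass(arity).newInstance();

        // Populate every position so the type extractor can infer a field type.
        for (int i = 0; i < arity; i++) {
            t.setField("field-" + i, i);
        }

        // Extract TypeInformation from the populated instance, mirroring the last line of getOutputType.
        TypeInformation<Tuple> type = TypeExtractor.getForObject(t);
        System.out.println(type);
    }
}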

Example 39 with Tuple

use of org.apache.flink.api.java.tuple.Tuple in project flink by apache.

the class BoltCollectorTest method testBoltStormCollectorWithTaskId.

@SuppressWarnings({ "rawtypes", "unchecked" })
@Test
public void testBoltStormCollectorWithTaskId() throws InstantiationException, IllegalAccessException {
    for (int numberOfAttributes = 0; numberOfAttributes < 25; ++numberOfAttributes) {
        final Output flinkCollector = mock(Output.class);
        final int taskId = 42;
        final String streamId = "streamId";
        HashMap<String, Integer> attributes = new HashMap<String, Integer>();
        attributes.put(streamId, numberOfAttributes);
        BoltCollector<?> collector = new BoltCollector(attributes, taskId, flinkCollector);
        final Values tuple = new Values();
        final Tuple flinkTuple = Tuple.getTupleClass(numberOfAttributes + 1).newInstance();
        for (int i = 0; i < numberOfAttributes; ++i) {
            tuple.add(new Integer(this.r.nextInt()));
            flinkTuple.setField(tuple.get(i), i);
        }
        flinkTuple.setField(taskId, numberOfAttributes);
        final Collection anchors = mock(Collection.class);
        final List<Integer> taskIds;
        taskIds = collector.emit(streamId, anchors, tuple);
        Assert.assertNull(taskIds);
        verify(flinkCollector).collect(flinkTuple);
    }
}
Also used : HashMap(java.util.HashMap) Output(org.apache.flink.streaming.api.operators.Output) Values(org.apache.storm.tuple.Values) Collection(java.util.Collection) Tuple(org.apache.flink.api.java.tuple.Tuple) AbstractTest(org.apache.flink.storm.util.AbstractTest) Test(org.junit.Test)
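
The assertion verifies that the collector copies the Storm values into a Flink Tuple of matching arity and stores the task id in the extra last position. A hedged sketch of that conversion as a standalone helper (this is not the BoltCollector implementation, just the same Tuple construction pattern):

import java.util.Arrays;
import java.util.List;

import org.apache.flink.api.java.tuple.Tuple;

public class ValuesToTupleSketch {

    // Copies the given values into a Flink Tuple and appends the task id as the last field.
    static Tuple toFlinkTuple(List<Object> values, int taskId) throws Exception {
        Tuple t = Tuple.getTupleClass(values.size() + 1).newInstance();
        for (int i = 0; i < values.size(); i++) {
            t.setField(values.get(i), i);
        }
        t.setField(taskId, values.size());
        return t;
    }

    public static void main(String[] args) throws Exception {
        Tuple t = toFlinkTuple(Arrays.asList("a", 1, 2.0), 42);
        // Prints (a,1,2.0,42): three attributes plus the task id.
        System.out.println(t);
    }
}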

Example 40 with Tuple

use of org.apache.flink.api.java.tuple.Tuple in project flink by apache.

the class BoltWrapperTest method testWrapper.

@SuppressWarnings({ "rawtypes", "unchecked" })
private void testWrapper(final int numberOfAttributes) throws Exception {
    assert ((-1 <= numberOfAttributes) && (numberOfAttributes <= 25));
    Tuple flinkTuple = null;
    String rawTuple = null;
    if (numberOfAttributes == -1) {
        rawTuple = "test";
    } else {
        flinkTuple = Tuple.getTupleClass(numberOfAttributes).newInstance();
    }
    final String[] schema;
    if (numberOfAttributes == -1) {
        schema = new String[1];
    } else {
        schema = new String[numberOfAttributes];
    }
    for (int i = 0; i < schema.length; ++i) {
        schema[i] = "a" + i;
    }
    final StreamRecord record = mock(StreamRecord.class);
    if (numberOfAttributes == -1) {
        when(record.getValue()).thenReturn(rawTuple);
    } else {
        when(record.getValue()).thenReturn(flinkTuple);
    }
    final StreamingRuntimeContext taskContext = mock(StreamingRuntimeContext.class);
    when(taskContext.getExecutionConfig()).thenReturn(mock(ExecutionConfig.class));
    when(taskContext.getTaskName()).thenReturn("name");
    when(taskContext.getMetricGroup()).thenReturn(new UnregisteredMetricsGroup());
    final IRichBolt bolt = mock(IRichBolt.class);
    final SetupOutputFieldsDeclarer declarer = new SetupOutputFieldsDeclarer();
    declarer.declare(new Fields(schema));
    PowerMockito.whenNew(SetupOutputFieldsDeclarer.class).withNoArguments().thenReturn(declarer);
    final BoltWrapper wrapper = new BoltWrapper(bolt, (Fields) null);
    wrapper.setup(createMockStreamTask(), new StreamConfig(new Configuration()), mock(Output.class));
    wrapper.open();
    wrapper.processElement(record);
    if (numberOfAttributes == -1) {
        verify(bolt).execute(eq(new StormTuple<String>(rawTuple, null, -1, null, null, MessageId.makeUnanchored())));
    } else {
        verify(bolt).execute(eq(new StormTuple<Tuple>(flinkTuple, null, -1, null, null, MessageId.makeUnanchored())));
    }
}
Also used : IRichBolt(org.apache.storm.topology.IRichBolt) UnregisteredMetricsGroup(org.apache.flink.metrics.groups.UnregisteredMetricsGroup) StreamRecord(org.apache.flink.streaming.runtime.streamrecord.StreamRecord) StreamingRuntimeContext(org.apache.flink.streaming.api.operators.StreamingRuntimeContext) UnmodifiableConfiguration(org.apache.flink.configuration.UnmodifiableConfiguration) Configuration(org.apache.flink.configuration.Configuration) StreamConfig(org.apache.flink.streaming.api.graph.StreamConfig) ExecutionConfig(org.apache.flink.api.common.ExecutionConfig) Fields(org.apache.storm.tuple.Fields) Output(org.apache.flink.streaming.api.operators.Output) Tuple(org.apache.flink.api.java.tuple.Tuple)

Aggregations

Tuple (org.apache.flink.api.java.tuple.Tuple): 59
Test (org.junit.Test): 38
AbstractTest (org.apache.flink.storm.util.AbstractTest): 17
Tuple2 (org.apache.flink.api.java.tuple.Tuple2): 14
StreamExecutionEnvironment (org.apache.flink.streaming.api.environment.StreamExecutionEnvironment): 14
TimeWindow (org.apache.flink.streaming.api.windowing.windows.TimeWindow): 13
Tuple5 (org.apache.flink.api.java.tuple.Tuple5): 10
ArrayList (java.util.ArrayList): 9
Configuration (org.apache.flink.configuration.Configuration): 8
SuccessException (org.apache.flink.test.util.SuccessException): 7
IOException (java.io.IOException): 6
HashMap (java.util.HashMap): 6
ExecutionConfig (org.apache.flink.api.common.ExecutionConfig): 6
Fields (org.apache.storm.tuple.Fields): 6
Tuple4 (org.apache.flink.api.java.tuple.Tuple4): 5
OneInputTransformation (org.apache.flink.streaming.api.transformations.OneInputTransformation): 5
Keys (org.apache.flink.api.common.operators.Keys): 4
TypeInformation (org.apache.flink.api.common.typeinfo.TypeInformation): 4
ComparableAggregator (org.apache.flink.streaming.api.functions.aggregation.ComparableAggregator): 4
Values (org.apache.storm.tuple.Values): 4