
Example 1 with InstructionOutputNode

Use of org.apache.beam.runners.dataflow.worker.graph.Nodes.InstructionOutputNode in project beam by apache.

From class IntrinsicMapTaskExecutorFactory, method createOutputReceiversTransform:

/**
 * Returns a function which can convert {@link InstructionOutput}s into {@link OutputReceiver}s.
 */
static Function<Node, Node> createOutputReceiversTransform(
        final String stageName, final CounterFactory counterFactory) {
    return new TypeSafeNodeFunction<InstructionOutputNode>(InstructionOutputNode.class) {

        @Override
        public Node typedApply(InstructionOutputNode input) {
            InstructionOutput cloudOutput = input.getInstructionOutput();
            OutputReceiver outputReceiver = new OutputReceiver();
            Coder<?> coder =
                    CloudObjects.coderFromCloudObject(CloudObject.fromSpec(cloudOutput.getCodec()));
            @SuppressWarnings("unchecked")
            ElementCounter outputCounter =
                    new DataflowOutputCounter(
                            cloudOutput.getName(),
                            new ElementByteSizeObservableCoder<>(coder),
                            counterFactory,
                            NameContext.create(
                                    stageName,
                                    cloudOutput.getOriginalName(),
                                    cloudOutput.getSystemName(),
                                    cloudOutput.getName()));
            outputReceiver.addOutputCounter(outputCounter);
            return OutputReceiverNode.create(outputReceiver, coder, input.getPcollectionId());
        }
    };
}
Also used: InstructionOutputNode (org.apache.beam.runners.dataflow.worker.graph.Nodes.InstructionOutputNode), InstructionOutput (com.google.api.services.dataflow.model.InstructionOutput), OutputReceiver (org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver), TypeSafeNodeFunction (org.apache.beam.runners.dataflow.worker.graph.Networks.TypeSafeNodeFunction), ElementCounter (org.apache.beam.runners.dataflow.worker.util.common.worker.ElementCounter)
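
The transform above is wrapped in a TypeSafeNodeFunction so that only InstructionOutputNodes are rewritten while every other node in the instruction graph passes through unchanged. A minimal stand-alone sketch of that pass-through contract, written as an illustration of the assumed behavior rather than the Beam class itself:

import java.util.function.Function;

// Illustrative sketch only: mirrors the pass-through behavior assumed above.
// Nodes of the declared subtype are rewritten by typedApply; all other nodes
// are returned as-is.
abstract class TypedNodeFunctionSketch<N, T extends N> implements Function<N, N> {

    private final Class<T> type;

    protected TypedNodeFunctionSketch(Class<T> type) {
        this.type = type;
    }

    @Override
    public final N apply(N node) {
        return type.isInstance(node) ? typedApply(type.cast(node)) : node;
    }

    protected abstract N typedApply(T node);
}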

Example 2 with InstructionOutputNode

Use of org.apache.beam.runners.dataflow.worker.graph.Nodes.InstructionOutputNode in project beam by apache.

From class InsertFetchAndFilterStreamingSideInputNodesTest, method testSdkParDoWithSideInput:

@Test
public void testSdkParDoWithSideInput() throws Exception {
    Pipeline p = Pipeline.create();
    PCollection<String> pc = p.apply(Create.of("a", "b", "c"));
    PCollectionView<List<String>> pcView = pc.apply(View.asList());
    pc.apply(ParDo.of(new TestDoFn()).withSideInputs(pcView));
    RunnerApi.Pipeline pipeline = PipelineTranslation.toProto(p);
    Node predecessor = createParDoNode("predecessor");
    InstructionOutputNode mainInput = InstructionOutputNode.create(new InstructionOutput(), "fakeId");
    Node sideInputParDo = createParDoNode(findParDoWithSideInput(pipeline));
    MutableNetwork<Node, Edge> network = createEmptyNetwork();
    network.addNode(predecessor);
    network.addNode(mainInput);
    network.addNode(sideInputParDo);
    network.addEdge(predecessor, mainInput, DefaultEdge.create());
    network.addEdge(mainInput, sideInputParDo, DefaultEdge.create());
    network = InsertFetchAndFilterStreamingSideInputNodes.with(pipeline).forNetwork(network);
    Node mainInputClone = InstructionOutputNode.create(mainInput.getInstructionOutput(), "fakeId");
    Node fetchAndFilter =
        FetchAndFilterStreamingSideInputsNode.create(
            pcView.getWindowingStrategyInternal(),
            ImmutableMap.of(
                pcView,
                ParDoTranslation.translateWindowMappingFn(
                    pcView.getWindowMappingFn(),
                    SdkComponents.create(PipelineOptionsFactory.create()))),
            NameContextsForTests.nameContextForTest());
    MutableNetwork<Node, Edge> expectedNetwork = createEmptyNetwork();
    expectedNetwork.addNode(predecessor);
    expectedNetwork.addNode(mainInputClone);
    expectedNetwork.addNode(fetchAndFilter);
    expectedNetwork.addNode(mainInput);
    expectedNetwork.addNode(sideInputParDo);
    expectedNetwork.addEdge(predecessor, mainInputClone, DefaultEdge.create());
    expectedNetwork.addEdge(mainInputClone, fetchAndFilter, DefaultEdge.create());
    expectedNetwork.addEdge(fetchAndFilter, mainInput, DefaultEdge.create());
    expectedNetwork.addEdge(mainInput, sideInputParDo, DefaultEdge.create());
    assertThatNetworksAreIdentical(expectedNetwork, network);
}
Also used: FetchAndFilterStreamingSideInputsNode (org.apache.beam.runners.dataflow.worker.graph.Nodes.FetchAndFilterStreamingSideInputsNode), InstructionOutputNode (org.apache.beam.runners.dataflow.worker.graph.Nodes.InstructionOutputNode), ParallelInstructionNode (org.apache.beam.runners.dataflow.worker.graph.Nodes.ParallelInstructionNode), Node (org.apache.beam.runners.dataflow.worker.graph.Nodes.Node), InstructionOutput (com.google.api.services.dataflow.model.InstructionOutput), Pipeline (org.apache.beam.sdk.Pipeline), RunnerApi (org.apache.beam.model.pipeline.v1.RunnerApi), ArrayList (java.util.ArrayList), List (java.util.List), Edge (org.apache.beam.runners.dataflow.worker.graph.Edges.Edge), DefaultEdge (org.apache.beam.runners.dataflow.worker.graph.Edges.DefaultEdge), Test (org.junit.Test)
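
The test above also relies on helpers such as createEmptyNetwork, createParDoNode, findParDoWithSideInput, and assertThatNetworksAreIdentical that are not reproduced on this page. A hedged sketch of what createEmptyNetwork plausibly builds with the vendored Guava NetworkBuilder; the exact configuration in the Beam test may differ:

import org.apache.beam.runners.dataflow.worker.graph.Edges.Edge;
import org.apache.beam.runners.dataflow.worker.graph.Nodes.Node;
import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.graph.MutableNetwork;
import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.graph.NetworkBuilder;

final class NetworkTestHelperSketch {

    // Assumed shape of createEmptyNetwork: a directed mutable network that
    // allows parallel edges (needed for flatten-style fan-in) but no self loops.
    static MutableNetwork<Node, Edge> createEmptyNetwork() {
        return NetworkBuilder.directed()
                .allowsSelfLoops(false)
                .allowsParallelEdges(true)
                .build();
    }
}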

Example 3 with InstructionOutputNode

Use of org.apache.beam.runners.dataflow.worker.graph.Nodes.InstructionOutputNode in project beam by apache.

From class MapTaskToNetworkFunctionTest, method testMultipleOutput:

@Test
public void testMultipleOutput() {
    //                 /---> WriteA
    // Read ---> ParDo
    //                 \---> WriteB
    InstructionOutput readOutput = createInstructionOutput("Read.out");
    ParallelInstruction read = createParallelInstruction("Read", readOutput);
    read.setRead(new ReadInstruction());
    MultiOutputInfo parDoMultiOutput1 = createMultiOutputInfo("output1");
    MultiOutputInfo parDoMultiOutput2 = createMultiOutputInfo("output2");
    ParDoInstruction parDoInstruction = new ParDoInstruction();
    // Read.out
    parDoInstruction.setInput(createInstructionInput(0, 0));
    parDoInstruction.setMultiOutputInfos(ImmutableList.of(parDoMultiOutput1, parDoMultiOutput2));
    InstructionOutput parDoOutput1 = createInstructionOutput("ParDo.out1");
    InstructionOutput parDoOutput2 = createInstructionOutput("ParDo.out2");
    ParallelInstruction parDo = createParallelInstruction("ParDo", parDoOutput1, parDoOutput2);
    parDo.setParDo(parDoInstruction);
    WriteInstruction writeAInstruction = new WriteInstruction();
    // ParDo.out1
    writeAInstruction.setInput(createInstructionInput(1, 0));
    ParallelInstruction writeA = createParallelInstruction("WriteA");
    writeA.setWrite(writeAInstruction);
    WriteInstruction writeBInstruction = new WriteInstruction();
    // ParDo.out2
    writeBInstruction.setInput(createInstructionInput(1, 1));
    ParallelInstruction writeB = createParallelInstruction("WriteB");
    writeB.setWrite(writeBInstruction);
    MapTask mapTask = new MapTask();
    mapTask.setInstructions(ImmutableList.of(read, parDo, writeA, writeB));
    mapTask.setFactory(Transport.getJsonFactory());
    Network<Node, Edge> network = new MapTaskToNetworkFunction(IdGenerators.decrementingLongs()).apply(mapTask);
    assertNetworkProperties(network);
    assertEquals(7, network.nodes().size());
    assertEquals(6, network.edges().size());
    ParallelInstructionNode parDoNode = get(network, parDo);
    ParallelInstructionNode writeANode = get(network, writeA);
    ParallelInstructionNode writeBNode = get(network, writeB);
    InstructionOutputNode parDoOutput1Node = getOnlyPredecessor(network, writeANode);
    assertEquals(parDoOutput1, parDoOutput1Node.getInstructionOutput());
    InstructionOutputNode parDoOutput2Node = getOnlyPredecessor(network, writeBNode);
    assertEquals(parDoOutput2, parDoOutput2Node.getInstructionOutput());
    assertThat(network.successors(parDoNode), Matchers.<Node>containsInAnyOrder(parDoOutput1Node, parDoOutput2Node));
    assertEquals(
        parDoMultiOutput1,
        ((MultiOutputInfoEdge)
                Iterables.getOnlyElement(network.edgesConnecting(parDoNode, parDoOutput1Node)))
            .getMultiOutputInfo());
    assertEquals(
        parDoMultiOutput2,
        ((MultiOutputInfoEdge)
                Iterables.getOnlyElement(network.edgesConnecting(parDoNode, parDoOutput2Node)))
            .getMultiOutputInfo());
}
Also used: MultiOutputInfo (com.google.api.services.dataflow.model.MultiOutputInfo), InstructionOutputNode (org.apache.beam.runners.dataflow.worker.graph.Nodes.InstructionOutputNode), ParallelInstructionNode (org.apache.beam.runners.dataflow.worker.graph.Nodes.ParallelInstructionNode), Node (org.apache.beam.runners.dataflow.worker.graph.Nodes.Node), InstructionOutput (com.google.api.services.dataflow.model.InstructionOutput), ReadInstruction (com.google.api.services.dataflow.model.ReadInstruction), ParallelInstruction (com.google.api.services.dataflow.model.ParallelInstruction), ParDoInstruction (com.google.api.services.dataflow.model.ParDoInstruction), MapTask (com.google.api.services.dataflow.model.MapTask), WriteInstruction (com.google.api.services.dataflow.model.WriteInstruction), Edge (org.apache.beam.runners.dataflow.worker.graph.Edges.Edge), DefaultEdge (org.apache.beam.runners.dataflow.worker.graph.Edges.DefaultEdge), MultiOutputInfoEdge (org.apache.beam.runners.dataflow.worker.graph.Edges.MultiOutputInfoEdge), Test (org.junit.Test)
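
The createInstructionOutput, createParallelInstruction, createMultiOutputInfo, and createInstructionInput calls above are shared test helpers that are not shown on this page. A hedged sketch of the Dataflow model objects they most likely populate; the exact fields set by the real helpers may differ:

import java.util.Arrays;

import com.google.api.services.dataflow.model.InstructionInput;
import com.google.api.services.dataflow.model.InstructionOutput;
import com.google.api.services.dataflow.model.MultiOutputInfo;
import com.google.api.services.dataflow.model.ParallelInstruction;

final class MapTaskModelHelperSketch {

    // Assumed helper: an output identified only by its name.
    static InstructionOutput createInstructionOutput(String name) {
        return new InstructionOutput().setName(name);
    }

    // Assumed helper: an instruction with a name and zero or more outputs.
    static ParallelInstruction createParallelInstruction(String name, InstructionOutput... outputs) {
        return new ParallelInstruction().setName(name).setOutputs(Arrays.asList(outputs));
    }

    // Assumed helper: a multi-output tag such as "output1".
    static MultiOutputInfo createMultiOutputInfo(String tag) {
        return new MultiOutputInfo().setTag(tag);
    }

    // Assumed helper: a reference to output `outputNum` of the instruction at
    // index `producerInstructionIndex` in the MapTask instruction list.
    static InstructionInput createInstructionInput(int producerInstructionIndex, int outputNum) {
        return new InstructionInput()
                .setProducerInstructionIndex(producerInstructionIndex)
                .setOutputNum(outputNum);
    }
}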

Example 4 with InstructionOutputNode

Use of org.apache.beam.runners.dataflow.worker.graph.Nodes.InstructionOutputNode in project beam by apache.

From class MapTaskToNetworkFunctionTest, method testRead:

@Test
public void testRead() {
    InstructionOutput readOutput = createInstructionOutput("Read.out");
    ParallelInstruction read = createParallelInstruction("Read", readOutput);
    read.setRead(new ReadInstruction());
    MapTask mapTask = new MapTask();
    mapTask.setInstructions(ImmutableList.of(read));
    mapTask.setFactory(Transport.getJsonFactory());
    Network<Node, Edge> network = new MapTaskToNetworkFunction(IdGenerators.decrementingLongs()).apply(mapTask);
    assertNetworkProperties(network);
    assertEquals(2, network.nodes().size());
    assertEquals(1, network.edges().size());
    ParallelInstructionNode readNode = get(network, read);
    InstructionOutputNode readOutputNode = getOnlySuccessor(network, readNode);
    assertEquals(readOutput, readOutputNode.getInstructionOutput());
}
Also used: ParallelInstruction (com.google.api.services.dataflow.model.ParallelInstruction), InstructionOutputNode (org.apache.beam.runners.dataflow.worker.graph.Nodes.InstructionOutputNode), MapTask (com.google.api.services.dataflow.model.MapTask), ParallelInstructionNode (org.apache.beam.runners.dataflow.worker.graph.Nodes.ParallelInstructionNode), Node (org.apache.beam.runners.dataflow.worker.graph.Nodes.Node), InstructionOutput (com.google.api.services.dataflow.model.InstructionOutput), ReadInstruction (com.google.api.services.dataflow.model.ReadInstruction), Edge (org.apache.beam.runners.dataflow.worker.graph.Edges.Edge), DefaultEdge (org.apache.beam.runners.dataflow.worker.graph.Edges.DefaultEdge), MultiOutputInfoEdge (org.apache.beam.runners.dataflow.worker.graph.Edges.MultiOutputInfoEdge), Test (org.junit.Test)
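
getOnlySuccessor and getOnlyPredecessor, used throughout these tests, are likewise shared helpers rather than part of the Guava Network API. A hedged sketch of what they plausibly do, assuming they simply pick the single neighbor and cast it to the expected node type:

import org.apache.beam.runners.dataflow.worker.graph.Edges.Edge;
import org.apache.beam.runners.dataflow.worker.graph.Nodes.Node;
import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Iterables;
import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.graph.Network;

final class NeighborHelperSketch {

    // Assumed helper: fails (via getOnlyElement) unless the node has exactly
    // one successor, then casts it to the caller's expected node type.
    @SuppressWarnings("unchecked")
    static <T extends Node> T getOnlySuccessor(Network<Node, Edge> network, Node node) {
        return (T) Iterables.getOnlyElement(network.successors(node));
    }

    // Assumed helper: same idea for the single predecessor.
    @SuppressWarnings("unchecked")
    static <T extends Node> T getOnlyPredecessor(Network<Node, Edge> network, Node node) {
        return (T) Iterables.getOnlyElement(network.predecessors(node));
    }
}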

Example 5 with InstructionOutputNode

Use of org.apache.beam.runners.dataflow.worker.graph.Nodes.InstructionOutputNode in project beam by apache.

From class MapTaskToNetworkFunctionTest, method testParallelEdgeFlatten:

@Test
public void testParallelEdgeFlatten() {
    //                  /---\
    // Read --> Read.out --> Flatten
    //                  \---/
    InstructionOutput readOutput = createInstructionOutput("Read.out");
    ParallelInstruction read = createParallelInstruction("Read", readOutput);
    read.setRead(new ReadInstruction());
    FlattenInstruction flattenInstruction = new FlattenInstruction();
    flattenInstruction.setInputs(
        ImmutableList.of(
            createInstructionInput(0, 0), // Read.out
            createInstructionInput(0, 0), // Read.out
            createInstructionInput(0, 0))); // Read.out
    InstructionOutput flattenOutput = createInstructionOutput("Flatten.out");
    ParallelInstruction flatten = createParallelInstruction("Flatten", flattenOutput);
    flatten.setFlatten(flattenInstruction);
    MapTask mapTask = new MapTask();
    mapTask.setInstructions(ImmutableList.of(read, flatten));
    mapTask.setFactory(Transport.getJsonFactory());
    Network<Node, Edge> network = new MapTaskToNetworkFunction(IdGenerators.decrementingLongs()).apply(mapTask);
    assertNetworkProperties(network);
    assertEquals(4, network.nodes().size());
    assertEquals(5, network.edges().size());
    ParallelInstructionNode readNode = get(network, read);
    InstructionOutputNode readOutputNode = getOnlySuccessor(network, readNode);
    assertEquals(readOutput, readOutputNode.getInstructionOutput());
    ParallelInstructionNode flattenNode = getOnlySuccessor(network, readOutputNode);
    // Assert that the three parallel edges are maintained
    assertEquals(3, network.edgesConnecting(readOutputNode, flattenNode).size());
    InstructionOutputNode flattenOutputNode = getOnlySuccessor(network, flattenNode);
    assertEquals(flattenOutput, flattenOutputNode.getInstructionOutput());
}
Also used: ParallelInstruction (com.google.api.services.dataflow.model.ParallelInstruction), InstructionOutputNode (org.apache.beam.runners.dataflow.worker.graph.Nodes.InstructionOutputNode), MapTask (com.google.api.services.dataflow.model.MapTask), ParallelInstructionNode (org.apache.beam.runners.dataflow.worker.graph.Nodes.ParallelInstructionNode), Node (org.apache.beam.runners.dataflow.worker.graph.Nodes.Node), InstructionOutput (com.google.api.services.dataflow.model.InstructionOutput), ReadInstruction (com.google.api.services.dataflow.model.ReadInstruction), FlattenInstruction (com.google.api.services.dataflow.model.FlattenInstruction), Edge (org.apache.beam.runners.dataflow.worker.graph.Edges.Edge), DefaultEdge (org.apache.beam.runners.dataflow.worker.graph.Edges.DefaultEdge), MultiOutputInfoEdge (org.apache.beam.runners.dataflow.worker.graph.Edges.MultiOutputInfoEdge), Test (org.junit.Test)
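
Each test above also calls assertNetworkProperties, another shared helper not shown here. A hedged guess at the kind of structural invariants it checks with the Guava Network API; the real helper in MapTaskToNetworkFunctionTest may assert more, for example that the graph is connected:

import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import org.apache.beam.runners.dataflow.worker.graph.Edges.Edge;
import org.apache.beam.runners.dataflow.worker.graph.Nodes.Node;
import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.graph.Network;

final class NetworkAssertionSketch {

    // Assumed checks: the translated instruction graph is directed, keeps
    // parallel edges (e.g. the three Read.out -> Flatten edges above), and
    // disallows self loops.
    static void assertNetworkProperties(Network<Node, Edge> network) {
        assertTrue(network.isDirected());
        assertTrue(network.allowsParallelEdges());
        assertFalse(network.allowsSelfLoops());
    }
}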

Aggregations

InstructionOutputNode (org.apache.beam.runners.dataflow.worker.graph.Nodes.InstructionOutputNode): 21 usages
Edge (org.apache.beam.runners.dataflow.worker.graph.Edges.Edge): 19 usages
Node (org.apache.beam.runners.dataflow.worker.graph.Nodes.Node): 19 usages
ParallelInstructionNode (org.apache.beam.runners.dataflow.worker.graph.Nodes.ParallelInstructionNode): 19 usages
DefaultEdge (org.apache.beam.runners.dataflow.worker.graph.Edges.DefaultEdge): 14 usages
InstructionOutput (com.google.api.services.dataflow.model.InstructionOutput): 13 usages
MultiOutputInfoEdge (org.apache.beam.runners.dataflow.worker.graph.Edges.MultiOutputInfoEdge): 13 usages
ParallelInstruction (com.google.api.services.dataflow.model.ParallelInstruction): 11 usages
Test (org.junit.Test): 10 usages
ReadInstruction (com.google.api.services.dataflow.model.ReadInstruction): 9 usages
MapTask (com.google.api.services.dataflow.model.MapTask): 8 usages
MultiOutputInfo (com.google.api.services.dataflow.model.MultiOutputInfo): 5 usages
ParDoInstruction (com.google.api.services.dataflow.model.ParDoInstruction): 5 usages
RunnerApi (org.apache.beam.model.pipeline.v1.RunnerApi): 5 usages
RemoteGrpcPortNode (org.apache.beam.runners.dataflow.worker.graph.Nodes.RemoteGrpcPortNode): 5 usages
CloudObject (org.apache.beam.runners.dataflow.util.CloudObject): 4 usages
ImmutableMap (org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap): 4 usages
WriteInstruction (com.google.api.services.dataflow.model.WriteInstruction): 3 usages
IOException (java.io.IOException): 3 usages
ArrayList (java.util.ArrayList): 3 usages