
Example 1 with Value

Use of co.cask.cdap.api.workflow.Value in project cdap by caskdata.

From the class StreamToDataset, the initialize method:

@Override
public void initialize() throws Exception {
    MapReduceContext context = getContext();
    Job job = context.getHadoopJob();
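    // Map-only job: the mapper output goes straight to the dataset, so no reducers are needed.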
    job.setNumReduceTasks(0);
    WorkflowToken workflowToken = context.getWorkflowToken();
    Class<? extends Mapper> mapper = PageTitleToDatasetMapper.class;
    String inputStream = WikipediaPipelineApp.PAGE_TITLES_STREAM;
    String outputDataset = WikipediaPipelineApp.PAGE_TITLES_DATASET;
    if (workflowToken != null) {
        Value likesToDatasetResult = workflowToken.get("result", WikipediaPipelineApp.LIKES_TO_DATASET_MR_NAME);
        if (likesToDatasetResult != null && likesToDatasetResult.getAsBoolean()) {
            // The "likes" stream-to-dataset MapReduce has already completed successfully
            // in this workflow run, so process the raw Wikipedia stream into its dataset next.
            mapper = RawWikiDataToDatasetMapper.class;
            inputStream = WikipediaPipelineApp.RAW_WIKIPEDIA_STREAM;
            outputDataset = WikipediaPipelineApp.RAW_WIKIPEDIA_DATASET;
        }
    }
    LOG.info("Using '{}' as the input stream and '{}' as the output dataset.", inputStream, outputDataset);
    job.setMapperClass(mapper);
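    // Resolve the namespace for input and output: prefer the runtime argument,
    // falling back to the namespace the program itself runs in.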
    String dataNamespace = context.getRuntimeArguments().get(WikipediaPipelineApp.NAMESPACE_ARG);
    dataNamespace = dataNamespace == null ? getContext().getNamespace() : dataNamespace;
    context.addInput(Input.ofStream(inputStream).fromNamespace(dataNamespace));
    context.addOutput(Output.ofDataset(outputDataset).fromNamespace(dataNamespace));
}
Also used: MapReduceContext (co.cask.cdap.api.mapreduce.MapReduceContext), Value (co.cask.cdap.api.workflow.Value), WorkflowToken (co.cask.cdap.api.workflow.WorkflowToken), Job (org.apache.hadoop.mapreduce.Job)
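
The second argument to workflowToken.get is the name of the workflow node whose value should be read, so the lookup above only succeeds if the "likes" MapReduce recorded something under the key "result". A minimal sketch of how that predecessor node could do so from its destroy() hook, assuming the standard CDAP ProgramState/ProgramStatus API and the Value.of(boolean) factory:

@Override
public void destroy() {
    WorkflowToken workflowToken = getContext().getWorkflowToken();
    if (workflowToken != null) {
        // Sketch: record whether this MapReduce completed successfully so that
        // downstream nodes (such as StreamToDataset above) can branch on it.
        boolean succeeded = getContext().getState().getStatus() == ProgramStatus.COMPLETED;
        workflowToken.put("result", Value.of(succeeded));
    }
}

Because each node writes into its own scope of the token, readers disambiguate by node name rather than by key alone.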

Example 2 with Value

Use of co.cask.cdap.api.workflow.Value in project cdap by caskdata.

From the class SparkExecutionServiceTest, the testWorkflowToken method:

@Test
public void testWorkflowToken() throws Exception {
    ProgramRunId programRunId = new ProgramRunId("ns", "app", ProgramType.SPARK, "test", RunIds.generate().getId());
    // Start a service with empty workflow token
    BasicWorkflowToken token = new BasicWorkflowToken(10);
    token.setCurrentNode("spark");
    SparkExecutionService service = new SparkExecutionService(locationFactory, InetAddress.getLoopbackAddress().getCanonicalHostName(), programRunId, token);
    service.startAndWait();
    try {
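        // Create a client bound to this run's service endpoint; the test plays
        // the role of the remote Spark driver sending updates.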
        SparkExecutionClient client = new SparkExecutionClient(service.getBaseURI(), programRunId);
        // Update token via heartbeat
        BasicWorkflowToken clientToken = new BasicWorkflowToken(10);
        clientToken.setCurrentNode("spark");
        for (int i = 0; i < 5; i++) {
            clientToken.put("key", "value" + i);
            client.heartbeat(clientToken);
            // The server-side token should be updated after each heartbeat
            Assert.assertEquals(Value.of("value" + i), token.get("key", "spark"));
        }
        clientToken.put("completed", "true");
        client.completed(clientToken);
    } finally {
        service.stopAndWait();
    }
    // The token on the service side should get updated after the completed call.
    Map<String, Value> values = token.getAllFromNode("spark");
    Map<String, Value> expected = ImmutableMap.of("key", Value.of("value4"), "completed", Value.of("true"));
    Assert.assertEquals(expected, values);
}
Also used: Value (co.cask.cdap.api.workflow.Value), BasicWorkflowToken (co.cask.cdap.internal.app.runtime.workflow.BasicWorkflowToken), ProgramRunId (co.cask.cdap.proto.id.ProgramRunId), Test (org.junit.Test)
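
The assertions above depend on Value's equality semantics and typed accessors. A short, hypothetical JUnit sketch of that surface, assuming the of(...) factories and getAs...() accessors seen in these examples operate on an underlying string form:

@Test
public void testValueAccessors() {
    // Sketch only: assumes Value keeps a string form and exposes typed views of it.
    Value numeric = Value.of("42");
    Assert.assertEquals(42L, numeric.getAsLong());
    Assert.assertTrue(Value.of("true").getAsBoolean());
    // The assertEquals calls in testWorkflowToken above rely on Value
    // implementing equals() over that stored form.
    Assert.assertEquals(Value.of("42"), numeric);
}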

Aggregations

Value (co.cask.cdap.api.workflow.Value): 2 usages
MapReduceContext (co.cask.cdap.api.mapreduce.MapReduceContext): 1 usage
WorkflowToken (co.cask.cdap.api.workflow.WorkflowToken): 1 usage
BasicWorkflowToken (co.cask.cdap.internal.app.runtime.workflow.BasicWorkflowToken): 1 usage
ProgramRunId (co.cask.cdap.proto.id.ProgramRunId): 1 usage
Job (org.apache.hadoop.mapreduce.Job): 1 usage
Test (org.junit.Test): 1 usage