
Example 1 with SparkExecutionPluginContext

Use of io.cdap.cdap.etl.api.batch.SparkExecutionPluginContext in project cdap by caskdata.

From the class BaseRDDCollection, method createStoreTask:

@Override
public Runnable createStoreTask(final StageSpec stageSpec, final SparkSink<T> sink) throws Exception {
    return new Runnable() {

        @Override
        public void run() {
            String stageName = stageSpec.getName();
            PipelineRuntime pipelineRuntime = new SparkPipelineRuntime(sec);
            // Build the plugin-facing execution context for this stage from the
            // Spark execution context, the JavaSparkContext, and the dataset context.
            SparkExecutionPluginContext sparkPluginContext = new BasicSparkExecutionPluginContext(sec, jsc, datasetContext, pipelineRuntime, stageSpec);
            // Count incoming records for the records.in metric before handing the RDD to the sink.
            JavaRDD<T> countedRDD = rdd.map(new CountingFunction<T>(stageName, sec.getMetrics(), Constants.Metrics.RECORDS_IN, null));
            try {
                sink.run(sparkPluginContext, countedRDD);
            } catch (Exception e) {
                throw Throwables.propagate(e);
            }
        }
    };
}
Also used : SparkExecutionPluginContext(io.cdap.cdap.etl.api.batch.SparkExecutionPluginContext) PipelineRuntime(io.cdap.cdap.etl.common.PipelineRuntime) SparkPipelineRuntime(io.cdap.cdap.etl.spark.SparkPipelineRuntime) SparkConf(org.apache.spark.SparkConf) AccessException(io.cdap.cdap.api.security.AccessException) DatasetManagementException(io.cdap.cdap.api.dataset.DatasetManagementException)
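For orientation, here is a minimal sketch of a SparkSink that the store task above would drive. It is not taken from the CDAP sources, and the class name is hypothetical; the framework invokes run() with the execution context and the metrics-counted RDD built in createStoreTask.

import io.cdap.cdap.etl.api.batch.SparkExecutionPluginContext;
import io.cdap.cdap.etl.api.batch.SparkPluginContext;
import io.cdap.cdap.etl.api.batch.SparkSink;
import org.apache.spark.api.java.JavaRDD;

// Hypothetical sink for illustration: it only materializes the RDD and prints
// a count; a real sink would write the records to external storage.
public class CountingLogSink extends SparkSink<String> {

    @Override
    public void prepareRun(SparkPluginContext context) throws Exception {
        // no setup needed for this sketch
    }

    @Override
    public void run(SparkExecutionPluginContext context, JavaRDD<String> input) throws Exception {
        // 'input' corresponds to the countedRDD passed by createStoreTask above
        long count = input.count();
        System.out.println("sink received " + count + " records");
    }
}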

Example 2 with SparkExecutionPluginContext

Use of io.cdap.cdap.etl.api.batch.SparkExecutionPluginContext in project cdap by caskdata.

From the class BaseRDDCollection, method compute:

@Override
public <U> SparkCollection<U> compute(StageSpec stageSpec, SparkCompute<T, U> compute) throws Exception {
    String stageName = stageSpec.getName();
    PipelineRuntime pipelineRuntime = new SparkPipelineRuntime(sec);
    SparkExecutionPluginContext sparkPluginContext = new BasicSparkExecutionPluginContext(sec, jsc, datasetContext, pipelineRuntime, stageSpec);
    compute.initialize(sparkPluginContext);
    // Count records on the way in and the way out for metrics; the output side
    // also attaches the data tracer for preview.
    JavaRDD<T> countedInput = rdd.map(new CountingFunction<T>(stageName, sec.getMetrics(), Constants.Metrics.RECORDS_IN, null));
    return wrap(compute.transform(sparkPluginContext, countedInput)
                    .map(new CountingFunction<U>(stageName, sec.getMetrics(), Constants.Metrics.RECORDS_OUT, sec.getDataTracer(stageName))));
}
Also used : SparkExecutionPluginContext(io.cdap.cdap.etl.api.batch.SparkExecutionPluginContext) PipelineRuntime(io.cdap.cdap.etl.common.PipelineRuntime) SparkPipelineRuntime(io.cdap.cdap.etl.spark.SparkPipelineRuntime) CountingFunction(io.cdap.cdap.etl.spark.function.CountingFunction) SparkConf(org.apache.spark.SparkConf)
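Likewise, a minimal sketch of a SparkCompute that compute() above would initialize and transform with; this is illustrative only, and the class name is hypothetical.

import io.cdap.cdap.etl.api.batch.SparkCompute;
import io.cdap.cdap.etl.api.batch.SparkExecutionPluginContext;
import org.apache.spark.api.java.JavaRDD;

// Hypothetical compute for illustration: upper-cases each record. transform()
// runs between the two CountingFunction maps shown in BaseRDDCollection.compute.
public class UppercaseCompute extends SparkCompute<String, String> {

    @Override
    public void initialize(SparkExecutionPluginContext context) throws Exception {
        // called once by BaseRDDCollection.compute before transform()
    }

    @Override
    public JavaRDD<String> transform(SparkExecutionPluginContext context, JavaRDD<String> input) throws Exception {
        return input.map(String::toUpperCase);
    }
}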

Example 3 with SparkExecutionPluginContext

Use of io.cdap.cdap.etl.api.batch.SparkExecutionPluginContext in project cdap by caskdata.

From the class DStreamCollection, method compute:

@Override
public <U> SparkCollection<U> compute(StageSpec stageSpec, SparkCompute<T, U> compute) throws Exception {
    SparkCompute<T, U> wrappedCompute = new DynamicSparkCompute<>(new DynamicDriverContext(stageSpec, sec, new NoopStageStatisticsCollector()), compute);
    // initialize() may access datasets, so it runs inside a transaction that
    // provides the DatasetContext.
    Transactionals.execute(sec, new TxRunnable() {

        @Override
        public void run(DatasetContext datasetContext) throws Exception {
            PipelineRuntime pipelineRuntime = new SparkPipelineRuntime(sec);
            SparkExecutionPluginContext sparkPluginContext = new BasicSparkExecutionPluginContext(sec, JavaSparkContext.fromSparkContext(stream.context().sparkContext()), datasetContext, pipelineRuntime, stageSpec);
            wrappedCompute.initialize(sparkPluginContext);
        }
    }, Exception.class);
    return wrap(stream.transform(new ComputeTransformFunction<>(sec, stageSpec, wrappedCompute)));
}
Also used : DynamicSparkCompute(io.cdap.cdap.etl.spark.streaming.function.DynamicSparkCompute) NoopStageStatisticsCollector(io.cdap.cdap.etl.common.NoopStageStatisticsCollector) ComputeTransformFunction(io.cdap.cdap.etl.spark.streaming.function.ComputeTransformFunction) PipelineRuntime(io.cdap.cdap.etl.common.PipelineRuntime) SparkPipelineRuntime(io.cdap.cdap.etl.spark.SparkPipelineRuntime) BasicSparkExecutionPluginContext(io.cdap.cdap.etl.spark.batch.BasicSparkExecutionPluginContext) SparkExecutionPluginContext(io.cdap.cdap.etl.api.batch.SparkExecutionPluginContext) TxRunnable(io.cdap.cdap.api.TxRunnable) DatasetContext(io.cdap.cdap.api.data.DatasetContext)

Example 4 with SparkExecutionPluginContext

Use of io.cdap.cdap.etl.api.batch.SparkExecutionPluginContext in project cdap by caskdata.

From the class ComputeTransformFunction, method call:

@Override
public JavaRDD<U> call(JavaRDD<T> data, Time batchTime) throws Exception {
    // A fresh execution context is created for every micro-batch, keyed by the batch time.
    SparkExecutionPluginContext sparkPluginContext = new SparkStreamingExecutionContext(sec, JavaSparkContext.fromSparkContext(data.context()), batchTime.milliseconds(), stageSpec);
    String stageName = stageSpec.getName();
    data = data.map(new CountingFunction<T>(stageName, sec.getMetrics(), "records.in", null));
    return compute.transform(sparkPluginContext, data).map(new CountingFunction<U>(stageName, sec.getMetrics(), "records.out", sec.getDataTracer(stageName)));
}
Also used : SparkExecutionPluginContext(io.cdap.cdap.etl.api.batch.SparkExecutionPluginContext) SparkStreamingExecutionContext(io.cdap.cdap.etl.spark.streaming.SparkStreamingExecutionContext) CountingFunction(io.cdap.cdap.etl.spark.function.CountingFunction)
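Examples 3 and 4 fit together: DStreamCollection.compute registers the ComputeTransformFunction through stream.transform(...), and Spark Streaming then invokes call(data, batchTime) once per micro-batch. A new SparkStreamingExecutionContext is therefore constructed for every batch, while the wrapped compute itself is created only once.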

Example 5 with SparkExecutionPluginContext

Use of io.cdap.cdap.etl.api.batch.SparkExecutionPluginContext in project cdap by caskdata.

From the class DynamicSparkCompute, method lazyInit:

// when checkpointing is enabled, and Spark is loading DStream operations from an existing checkpoint,
// delegate will be null and the initialize() method won't have been called. So we need to instantiate
// the delegate and initialize it.
private void lazyInit(final JavaSparkContext jsc) throws Exception {
    if (delegate == null) {
        PluginFunctionContext pluginFunctionContext = dynamicDriverContext.getPluginFunctionContext();
        delegate = pluginFunctionContext.createPlugin();
        final StageSpec stageSpec = pluginFunctionContext.getStageSpec();
        final JavaSparkExecutionContext sec = dynamicDriverContext.getSparkExecutionContext();
        Transactionals.execute(sec, new TxRunnable() {

            @Override
            public void run(DatasetContext datasetContext) throws Exception {
                PipelineRuntime pipelineRuntime = new SparkPipelineRuntime(sec);
                SparkExecutionPluginContext sparkPluginContext = new BasicSparkExecutionPluginContext(sec, jsc, datasetContext, pipelineRuntime, stageSpec);
                delegate.initialize(sparkPluginContext);
            }
        }, Exception.class);
    }
}
Also used : BasicSparkExecutionPluginContext(io.cdap.cdap.etl.spark.batch.BasicSparkExecutionPluginContext) PluginFunctionContext(io.cdap.cdap.etl.spark.function.PluginFunctionContext) SparkExecutionPluginContext(io.cdap.cdap.etl.api.batch.SparkExecutionPluginContext) SparkPipelineRuntime(io.cdap.cdap.etl.spark.SparkPipelineRuntime) PipelineRuntime(io.cdap.cdap.etl.common.PipelineRuntime) TxRunnable(io.cdap.cdap.api.TxRunnable) StageSpec(io.cdap.cdap.etl.proto.v2.spec.StageSpec) JavaSparkExecutionContext(io.cdap.cdap.api.spark.JavaSparkExecutionContext) DatasetContext(io.cdap.cdap.api.data.DatasetContext)
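The caller of lazyInit is not included in the snippet. A plausible companion (an assumption based on the comment above, not quoted from the source) is a transform() override that restores the delegate before each use:

// Hypothetical sketch: re-create and initialize the delegate on first use after
// a checkpoint restore, then delegate the actual work.
@Override
public JavaRDD<U> transform(SparkExecutionPluginContext context, JavaRDD<T> input) throws Exception {
    lazyInit(JavaSparkContext.fromSparkContext(input.context()));
    return delegate.transform(context, input);
}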

Aggregations

SparkExecutionPluginContext (io.cdap.cdap.etl.api.batch.SparkExecutionPluginContext): 6 uses
PipelineRuntime (io.cdap.cdap.etl.common.PipelineRuntime): 5 uses
SparkPipelineRuntime (io.cdap.cdap.etl.spark.SparkPipelineRuntime): 5 uses
TxRunnable (io.cdap.cdap.api.TxRunnable): 3 uses
DatasetContext (io.cdap.cdap.api.data.DatasetContext): 3 uses
CountingFunction (io.cdap.cdap.etl.spark.function.CountingFunction): 3 uses
BasicSparkExecutionPluginContext (io.cdap.cdap.etl.spark.batch.BasicSparkExecutionPluginContext): 2 uses
SparkStreamingExecutionContext (io.cdap.cdap.etl.spark.streaming.SparkStreamingExecutionContext): 2 uses
SparkConf (org.apache.spark.SparkConf): 2 uses
DatasetManagementException (io.cdap.cdap.api.dataset.DatasetManagementException): 1 use
MacroEvaluator (io.cdap.cdap.api.macro.MacroEvaluator): 1 use
PluginContext (io.cdap.cdap.api.plugin.PluginContext): 1 use
AccessException (io.cdap.cdap.api.security.AccessException): 1 use
JavaSparkExecutionContext (io.cdap.cdap.api.spark.JavaSparkExecutionContext): 1 use
SparkPluginContext (io.cdap.cdap.etl.api.batch.SparkPluginContext): 1 use
BasicArguments (io.cdap.cdap.etl.common.BasicArguments): 1 use
DefaultMacroEvaluator (io.cdap.cdap.etl.common.DefaultMacroEvaluator): 1 use
NoopStageStatisticsCollector (io.cdap.cdap.etl.common.NoopStageStatisticsCollector): 1 use
StageSpec (io.cdap.cdap.etl.proto.v2.spec.StageSpec): 1 use
BasicSparkPluginContext (io.cdap.cdap.etl.spark.batch.BasicSparkPluginContext): 1 use