
Example 1 with HadoopDummyProgressable

Use of org.apache.flink.api.java.hadoop.mapred.wrapper.HadoopDummyProgressable in project flink by apache.

From the open method of class HadoopOutputFormatBase:

/**
 * Creates the temporary output file for the Hadoop RecordWriter.
 *
 * @param taskNumber The number of the parallel instance.
 * @param numTasks The number of parallel tasks.
 * @throws java.io.IOException if the task id is too large or the RecordWriter cannot be created
 */
@Override
public void open(int taskNumber, int numTasks) throws IOException {
    // enforce sequential open() calls
    synchronized (OPEN_MUTEX) {
        if (Integer.toString(taskNumber + 1).length() > 6) {
            throw new IOException("Task id too large.");
        }
        TaskAttemptID taskAttemptID = TaskAttemptID.forName("attempt__0000_r_"
                + String.format("%" + (6 - Integer.toString(taskNumber + 1).length()) + "s", " ").replace(" ", "0")
                + Integer.toString(taskNumber + 1)
                + "_0");
        this.jobConf.set("mapred.task.id", taskAttemptID.toString());
        this.jobConf.setInt("mapred.task.partition", taskNumber + 1);
        // for hadoop 2.2
        this.jobConf.set("mapreduce.task.attempt.id", taskAttemptID.toString());
        this.jobConf.setInt("mapreduce.task.partition", taskNumber + 1);
        this.context = new TaskAttemptContextImpl(this.jobConf, taskAttemptID);
        this.outputCommitter = this.jobConf.getOutputCommitter();
        JobContext jobContext = new JobContextImpl(this.jobConf, new JobID());
        this.outputCommitter.setupJob(jobContext);
        this.recordWriter = this.mapredOutputFormat.getRecordWriter(null, this.jobConf, Integer.toString(taskNumber + 1), new HadoopDummyProgressable());
    }
}
Imports used:

import java.io.IOException;

import org.apache.flink.api.java.hadoop.mapred.wrapper.HadoopDummyProgressable;
import org.apache.hadoop.mapred.JobContext;
import org.apache.hadoop.mapred.JobContextImpl;
import org.apache.hadoop.mapred.JobID;
import org.apache.hadoop.mapred.TaskAttemptContextImpl;
import org.apache.hadoop.mapred.TaskAttemptID;
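The String.format expression in open() left-pads the 1-based task number with zeros to six digits before embedding it in the attempt id string that TaskAttemptID.forName later parses. A minimal sketch of the same construction, using the simpler %06d format specifier (the helper name and sample values here are hypothetical, not part of the Flink code):

```java
public class AttemptIdDemo {

    // Builds the same "attempt__0000_r_NNNNNN_0" string as open(),
    // zero-padding the 1-based task number to six digits.
    static String attemptId(int taskNumber) {
        return "attempt__0000_r_" + String.format("%06d", taskNumber + 1) + "_0";
    }

    public static void main(String[] args) {
        // Task 3 (0-based) becomes the 1-based, zero-padded number 000004.
        System.out.println(attemptId(3)); // attempt__0000_r_000004_0
    }
}
```

The six-digit limit explains the "Task id too large" check at the top of open(): a task number whose decimal form exceeds six digits cannot be padded into this fixed-width field.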
