
Example 1 with AddDependencyToLeaves

Use of org.apache.hadoop.hive.ql.exec.repl.bootstrap.AddDependencyToLeaves in project hive by apache.

In class LoadPartitions, method createTableReplLogTask:

private void createTableReplLogTask() throws SemanticException {
    ReplStateLogWork replLogWork = new ReplStateLogWork(replLogger, tableDesc.getTableName(), tableDesc.tableType());
    Task<ReplStateLogWork> replLogTask = TaskFactory.get(replLogWork, context.hiveConf);
    if (tracker.tasks().isEmpty()) {
        // No tasks to depend on: the log task becomes the tracker's only task.
        tracker.addTask(replLogTask);
    } else {
        // Hang the log task off every leaf so it runs after all load tasks.
        DAGTraversal.traverse(tracker.tasks(), new AddDependencyToLeaves(replLogTask));
        List<Task<? extends Serializable>> visited = new ArrayList<>();
        tracker.updateTaskCount(replLogTask, visited);
    }
}
Also used: ReplCopyTask (org.apache.hadoop.hive.ql.exec.ReplCopyTask), Task (org.apache.hadoop.hive.ql.exec.Task), Serializable (java.io.Serializable), ReplStateLogWork (org.apache.hadoop.hive.ql.exec.repl.ReplStateLogWork), ArrayList (java.util.ArrayList), AddDependencyToLeaves (org.apache.hadoop.hive.ql.exec.repl.bootstrap.AddDependencyToLeaves)
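The pattern above is: if the tracker already holds tasks, walk the task DAG and hang the replication-log task off every leaf so it runs after all load work; otherwise the log task is added directly as the only task. A self-contained sketch of the leaf-attachment idea (SimpleTask, addDependencyToLeaves, and LeafDependencyDemo are hypothetical stand-ins for illustration, not Hive's actual classes):

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;
import java.util.Queue;

// Hypothetical minimal stand-in for a task node in the DAG.
class SimpleTask {
    final String name;
    final List<SimpleTask> children = new ArrayList<>();
    SimpleTask(String name) { this.name = name; }
}

public class LeafDependencyDemo {
    // Breadth-first walk of the task DAG; every task with no children (a leaf)
    // gets the dependent task appended, which is roughly what
    // DAGTraversal.traverse(tasks, new AddDependencyToLeaves(logTask)) achieves.
    static void addDependencyToLeaves(List<SimpleTask> roots, SimpleTask dependent) {
        Queue<SimpleTask> queue = new LinkedList<>(roots);
        while (!queue.isEmpty()) {
            SimpleTask t = queue.poll();
            if (t == dependent) {
                continue; // never attach the dependent task to itself
            }
            if (t.children.isEmpty()) {
                t.children.add(dependent); // leaf: attach the dependent task
            } else {
                queue.addAll(t.children);  // interior node: keep walking
            }
        }
    }

    public static void main(String[] args) {
        SimpleTask root = new SimpleTask("copy");
        root.children.add(new SimpleTask("move"));
        root.children.add(new SimpleTask("ddl"));
        SimpleTask logTask = new SimpleTask("replLog");
        addDependencyToLeaves(List.of(root), logTask);
        // Both leaves now have the log task as their dependent.
        System.out.println(root.children.get(0).children.get(0).name); // replLog
        System.out.println(root.children.get(1).children.get(0).name); // replLog
    }
}
```

Note that traversing an empty list visits nothing, which is why the Hive code guards the empty-tracker case and adds the log task directly instead.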

Example 2 with AddDependencyToLeaves

Use of org.apache.hadoop.hive.ql.exec.repl.bootstrap.AddDependencyToLeaves in project hive by apache.

In class LoadFunction, method createFunctionReplLogTask:

private void createFunctionReplLogTask(List<Task<? extends Serializable>> functionTasks, String functionName) {
    ReplStateLogWork replLogWork = new ReplStateLogWork(replLogger, functionName);
    Task<ReplStateLogWork> replLogTask = TaskFactory.get(replLogWork);
    DAGTraversal.traverse(functionTasks, new AddDependencyToLeaves(replLogTask));
}
Also used: ReplStateLogWork (org.apache.hadoop.hive.ql.exec.repl.ReplStateLogWork), AddDependencyToLeaves (org.apache.hadoop.hive.ql.exec.repl.bootstrap.AddDependencyToLeaves)

Example 3 with AddDependencyToLeaves

Use of org.apache.hadoop.hive.ql.exec.repl.bootstrap.AddDependencyToLeaves in project hive by apache.

In class LoadTable, method createTableReplLogTask:

private void createTableReplLogTask(String tableName, TableType tableType) throws SemanticException {
    ReplStateLogWork replLogWork = new ReplStateLogWork(replLogger, tableName, tableType);
    Task<ReplStateLogWork> replLogTask = TaskFactory.get(replLogWork);
    if (tracker.tasks().isEmpty()) {
        // No tasks to depend on: the log task becomes the tracker's only task.
        tracker.addTask(replLogTask);
    } else {
        // Hang the log task off every leaf so it runs after all load tasks.
        DAGTraversal.traverse(tracker.tasks(), new AddDependencyToLeaves(replLogTask));
        List<Task<? extends Serializable>> visited = new ArrayList<>();
        tracker.updateTaskCount(replLogTask, visited);
    }
}
Also used: ReplCopyTask (org.apache.hadoop.hive.ql.exec.ReplCopyTask), Task (org.apache.hadoop.hive.ql.exec.Task), Serializable (java.io.Serializable), ReplStateLogWork (org.apache.hadoop.hive.ql.exec.repl.ReplStateLogWork), ArrayList (java.util.ArrayList), AddDependencyToLeaves (org.apache.hadoop.hive.ql.exec.repl.bootstrap.AddDependencyToLeaves)

Aggregations

ReplStateLogWork (org.apache.hadoop.hive.ql.exec.repl.ReplStateLogWork): 3
AddDependencyToLeaves (org.apache.hadoop.hive.ql.exec.repl.bootstrap.AddDependencyToLeaves): 3
Serializable (java.io.Serializable): 2
ArrayList (java.util.ArrayList): 2
ReplCopyTask (org.apache.hadoop.hive.ql.exec.ReplCopyTask): 2
Task (org.apache.hadoop.hive.ql.exec.Task): 2