
Example 1 with ExportWork

Use of org.apache.hadoop.hive.ql.plan.ExportWork in project hive by apache.

From the class ExportSemanticAnalyzer, the method analyzeInternal:

@Override
public void analyzeInternal(ASTNode ast) throws SemanticException {
    Tree tableTree = ast.getChild(0);
    Tree toTree = ast.getChild(1);
    ReplicationSpec replicationSpec;
    if (ast.getChildCount() > 2) {
        // Replication case: export table <tbl> to <location> for replication
        replicationSpec = new ReplicationSpec((ASTNode) ast.getChild(2));
    } else {
        // Export case
        replicationSpec = new ReplicationSpec();
    }
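    // If the statement didn't carry a replication state, bookmark the current metastore
    // notification event id so the export is tied to a known point in the event stream.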
    if (replicationSpec.getCurrentReplicationState() == null) {
        try {
            long currentEventId = db.getMSC().getCurrentNotificationEventId().getEventId();
            replicationSpec.setCurrentReplicationState(String.valueOf(currentEventId));
        } catch (Exception e) {
            throw new SemanticException("Error when getting current notification event ID", e);
        }
    }
    // initialize source table/partition
    TableSpec ts;
    try {
        ts = new TableSpec(db, conf, (ASTNode) tableTree, false, true);
    } catch (SemanticException sme) {
        if ((replicationSpec.isInReplicationScope()) && ((sme.getCause() instanceof InvalidTableException) || (sme instanceof Table.ValidationFailureSemanticException))) {
            // If we're in replication scope, it's possible that we're running the export long after
            // the table was dropped, so the table not existing currently or being a different kind of
            // table is not an error - it simply means we should no-op, and let a future export
            // capture the appropriate state
            ts = null;
        } else {
            throw sme;
        }
    }
    // initialize export path
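    // The location literal arrives quoted from the parser; stripQuotes yields the raw path.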
    String tmpPath = stripQuotes(toTree.getText());
    // All parsing is done, we're now good to start the export process
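    // Resolve the export target directory; an unusable location is reported as INVALID_PATH.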
    TableExport.Paths exportPaths = new TableExport.Paths(ErrorMsg.INVALID_PATH.getMsg(ast), tmpPath, conf, false);
    TableExport tableExport = new TableExport(exportPaths, ts, replicationSpec, db, null, conf);
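    // Register the entities the export reads and writes so authorization hooks see them.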
    TableExport.AuthEntities authEntities = tableExport.getAuthEntities();
    inputs.addAll(authEntities.inputs);
    outputs.addAll(authEntities.outputs);
    String exportRootDirName = tmpPath;
    // Configure export work
    ExportWork exportWork = new ExportWork(exportRootDirName, ts, replicationSpec, ErrorMsg.INVALID_PATH.getMsg(ast));
    // Create an export task and add it as a root task
    Task<ExportWork> exportTask = TaskFactory.get(exportWork);
    rootTasks.add(exportTask);
}
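
For context, the two statement shapes this analyzer accepts can be driven through Hive's Driver API. The following is a minimal sketch, assuming a Hive version where the Driver(HiveConf) constructor is still available; the table name, paths, and replication event id are illustrative, not taken from the code above.

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.Driver;
import org.apache.hadoop.hive.ql.session.SessionState;

public class ExportStatementSketch {
    public static void main(String[] args) throws Exception {
        HiveConf conf = new HiveConf();
        SessionState.start(conf);
        Driver driver = new Driver(conf);
        // Plain export: the AST has two children (table, location),
        // so analyzeInternal builds an empty ReplicationSpec.
        driver.run("EXPORT TABLE example_src TO '/tmp/hive_export/example_src'");
        // Replication-scoped export: a third child carries the replication
        // clause, which is parsed into the ReplicationSpec.
        driver.run("EXPORT TABLE example_src TO '/tmp/hive_export/example_repl' FOR REPLICATION('42')");
    }
}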
Also used : Table(org.apache.hadoop.hive.ql.metadata.Table) InvalidTableException(org.apache.hadoop.hive.ql.metadata.InvalidTableException) ExportWork(org.apache.hadoop.hive.ql.plan.ExportWork) TableExport(org.apache.hadoop.hive.ql.parse.repl.dump.TableExport) Tree(org.antlr.runtime.tree.Tree)
