Example 1 with FetchConverter

Use of org.apache.hadoop.hive.common.io.FetchConverter in the apache/hive project, from the run method of the PreExecutePrinter class:

@Override
public void run(HookContext hookContext) throws Exception {
    assert (hookContext.getHookType() == HookType.PRE_EXEC_HOOK);
    SessionState ss = SessionState.get();
    QueryState queryState = hookContext.getQueryState();
    // Tell the FetchConverter on stdout whether genuine query output is coming:
    // only a real QUERY that is not being EXPLAINed produces fetchable rows.
    if (ss != null && ss.out instanceof FetchConverter) {
        boolean foundQuery = queryState.getHiveOperation() == HiveOperation.QUERY && !hookContext.getQueryPlan().isForExplain();
        ((FetchConverter) ss.out).foundQuery(foundQuery);
    }
    Set<ReadEntity> inputs = hookContext.getInputs();
    Set<WriteEntity> outputs = hookContext.getOutputs();
    UserGroupInformation ugi = hookContext.getUgi();
    this.run(queryState, inputs, outputs, ugi);
}
Also used: FetchConverter (org.apache.hadoop.hive.common.io.FetchConverter), SessionState (org.apache.hadoop.hive.ql.session.SessionState), QueryState (org.apache.hadoop.hive.ql.QueryState), UserGroupInformation (org.apache.hadoop.security.UserGroupInformation)
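The condition computed for foundQuery above can be restated on its own. The sketch below is hypothetical (FoundQuerySketch, isFetchableQuery, and the Operation enum are stand-ins, not Hive classes); it only illustrates the decision the hook makes: treat output as query results only for a genuine QUERY that is not an EXPLAIN.

```java
// Hypothetical, standalone restatement of the foundQuery condition from
// PreExecutePrinter. Operation is a stand-in for Hive's HiveOperation.
public class FoundQuerySketch {
    enum Operation { QUERY, EXPLAIN, DDL }

    static boolean isFetchableQuery(Operation op, boolean forExplain) {
        // Mirrors: getHiveOperation() == HiveOperation.QUERY && !isForExplain()
        return op == Operation.QUERY && !forExplain;
    }

    public static void main(String[] args) {
        System.out.println(isFetchableQuery(Operation.QUERY, false)); // true: real query
        System.out.println(isFetchableQuery(Operation.QUERY, true));  // false: EXPLAIN plan
        System.out.println(isFetchableQuery(Operation.DDL, false));   // false: not a query
    }
}
```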

Example 2 with FetchConverter

Use of org.apache.hadoop.hive.common.io.FetchConverter in the apache/hive project, from the processLocalCmd method of the CliDriver class:

int processLocalCmd(String cmd, CommandProcessor proc, CliSessionState ss) {
    int tryCount = 0;
    boolean needRetry;
    int ret = 0;
    do {
        try {
            needRetry = false;
            if (proc != null) {
                if (proc instanceof Driver) {
                    Driver qp = (Driver) proc;
                    PrintStream out = ss.out;
                    long start = System.currentTimeMillis();
                    if (ss.getIsVerbose()) {
                        out.println(cmd);
                    }
                    qp.setTryCount(tryCount);
                    ret = qp.run(cmd).getResponseCode();
                    if (ret != 0) {
                        qp.close();
                        return ret;
                    }
                    // the query has run; capture the elapsed time
                    long end = System.currentTimeMillis();
                    double timeTaken = (end - start) / 1000.0;
                    ArrayList<String> res = new ArrayList<String>();
                    printHeader(qp, out);
                    // print the results
                    int counter = 0;
                    try {
                        if (out instanceof FetchConverter) {
                            ((FetchConverter) out).fetchStarted();
                        }
                        while (qp.getResults(res)) {
                            for (String r : res) {
                                out.println(r);
                            }
                            counter += res.size();
                            res.clear();
                            if (out.checkError()) {
                                break;
                            }
                        }
                    } catch (IOException e) {
                        console.printError("Failed with exception " + e.getClass().getName() + ":" + e.getMessage(), "\n" + org.apache.hadoop.util.StringUtils.stringifyException(e));
                        ret = 1;
                    }
                    int cret = qp.close();
                    if (ret == 0) {
                        ret = cret;
                    }
                    if (out instanceof FetchConverter) {
                        ((FetchConverter) out).fetchFinished();
                    }
                    console.printInfo("Time taken: " + timeTaken + " seconds" + (counter == 0 ? "" : ", Fetched: " + counter + " row(s)"));
                } else {
                    String firstToken = tokenizeCmd(cmd.trim())[0];
                    String cmd_1 = getFirstCmd(cmd.trim(), firstToken.length());
                    if (ss.getIsVerbose()) {
                        ss.out.println(firstToken + " " + cmd_1);
                    }
                    CommandProcessorResponse res = proc.run(cmd_1);
                    if (res.getResponseCode() != 0) {
                        ss.out.println("Query returned non-zero code: " + res.getResponseCode() + ", cause: " + res.getErrorMessage());
                    }
                    if (res.getConsoleMessages() != null) {
                        for (String consoleMsg : res.getConsoleMessages()) {
                            console.printInfo(consoleMsg);
                        }
                    }
                    ret = res.getResponseCode();
                }
            }
        } catch (CommandNeedRetryException e) {
            console.printInfo("Retry query with a different approach...");
            tryCount++;
            needRetry = true;
        }
    } while (needRetry);
    return ret;
}
Also used: CachingPrintStream (org.apache.hadoop.hive.common.io.CachingPrintStream), PrintStream (java.io.PrintStream), CommandProcessorResponse (org.apache.hadoop.hive.ql.processors.CommandProcessorResponse), ArrayList (java.util.ArrayList), Driver (org.apache.hadoop.hive.ql.Driver), IOException (java.io.IOException), FetchConverter (org.apache.hadoop.hive.common.io.FetchConverter), CommandNeedRetryException (org.apache.hadoop.hive.ql.CommandNeedRetryException)
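The CliDriver code above brackets result printing with fetchStarted() and fetchFinished(). The self-contained sketch below is not Hive's actual FetchConverter (SimpleFetchConverter and its buffering behavior are assumptions for illustration); it only shows the shape of that lifecycle: a PrintStream wrapper that, once a query has been flagged via foundQuery(true), holds rows printed between fetchStarted() and fetchFinished() and emits them when the fetch ends.

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;

// Minimal sketch of the FetchConverter lifecycle, not the real Hive class.
class SimpleFetchConverter extends PrintStream {
    private final StringBuilder buffer = new StringBuilder();
    private boolean queryFound;
    private boolean fetching;

    SimpleFetchConverter(PrintStream delegate) {
        super(delegate, true);
    }

    // Set by a pre-execution hook (cf. PreExecutePrinter above).
    void foundQuery(boolean found) { queryFound = found; }

    // Called before the result loop (cf. processLocalCmd above).
    void fetchStarted() { fetching = true; buffer.setLength(0); }

    @Override
    public void println(String line) {
        if (fetching && queryFound) {
            buffer.append(line).append('\n'); // hold rows until the fetch ends
        } else {
            super.println(line);              // non-query output passes through
        }
    }

    // Called after the result loop: release the buffered rows.
    void fetchFinished() {
        fetching = false;
        if (queryFound) {
            super.print(buffer);
        }
    }
}

public class FetchConverterSketch {
    public static void main(String[] args) {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        SimpleFetchConverter out = new SimpleFetchConverter(new PrintStream(sink, true));

        out.foundQuery(true);   // hook-side decision
        out.fetchStarted();     // driver-side, before getResults()
        out.println("row1");
        out.println("row2");
        out.fetchFinished();    // rows reach the sink only now

        System.out.print(sink.toString());
    }
}
```

The point of the pattern is that the driver loop can print rows unconditionally; whether they are passed through or converted is decided entirely by the stream wrapper installed on ss.out.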

Aggregations

FetchConverter (org.apache.hadoop.hive.common.io.FetchConverter): 2
IOException (java.io.IOException): 1
PrintStream (java.io.PrintStream): 1
ArrayList (java.util.ArrayList): 1
CachingPrintStream (org.apache.hadoop.hive.common.io.CachingPrintStream): 1
CommandNeedRetryException (org.apache.hadoop.hive.ql.CommandNeedRetryException): 1
Driver (org.apache.hadoop.hive.ql.Driver): 1
QueryState (org.apache.hadoop.hive.ql.QueryState): 1
CommandProcessorResponse (org.apache.hadoop.hive.ql.processors.CommandProcessorResponse): 1
SessionState (org.apache.hadoop.hive.ql.session.SessionState): 1
UserGroupInformation (org.apache.hadoop.security.UserGroupInformation): 1