Example 26 with FetchTask

use of org.apache.hadoop.hive.ql.exec.FetchTask in project hive by apache.

The class SimpleFetchOptimizer, method transform:

@Override
public ParseContext transform(ParseContext pctx) throws SemanticException {
    Map<String, TableScanOperator> topOps = pctx.getTopOps();
    if (pctx.getQueryProperties().isQuery() && !pctx.getQueryProperties().isAnalyzeCommand() && topOps.size() == 1) {
        // no join, no groupby, no distinct, no lateral view, no subq,
        // no CTAS or insert, not analyze command, and single sourced.
        String alias = (String) pctx.getTopOps().keySet().toArray()[0];
        TableScanOperator topOp = pctx.getTopOps().values().iterator().next();
        try {
            FetchTask fetchTask = optimize(pctx, alias, topOp);
            if (fetchTask != null) {
                pctx.setFetchTask(fetchTask);
            }
        } catch (Exception e) {
            LOG.error("Failed to transform", e);
            if (e instanceof SemanticException) {
                throw (SemanticException) e;
            }
            throw new SemanticException(e.getMessage(), e);
        }
    }
    return pctx;
}
Also used : TableScanOperator (org.apache.hadoop.hive.ql.exec.TableScanOperator), SemanticException (org.apache.hadoop.hive.ql.parse.SemanticException), FileNotFoundException (java.io.FileNotFoundException), HiveException (org.apache.hadoop.hive.ql.metadata.HiveException), IOException (java.io.IOException), FetchTask (org.apache.hadoop.hive.ql.exec.FetchTask)
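The guard at the top of transform() only lets the optimization proceed when the query has exactly one top-level table scan, then pulls out that sole alias. A minimal, self-contained sketch of that single-source check follows; the class and method names (SingleSourceGuard, soleAlias) are illustrative only, not Hive API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Simplified model of the "single sourced" guard in SimpleFetchOptimizer.transform().
public class SingleSourceGuard {

    /** Returns the sole key if the map has exactly one entry, else null. */
    public static String soleAlias(Map<String, ?> topOps) {
        if (topOps.size() != 1) {
            return null; // more than one source: fetch optimization does not apply
        }
        return topOps.keySet().iterator().next();
    }

    public static void main(String[] args) {
        Map<String, Object> topOps = new LinkedHashMap<>();
        topOps.put("src", new Object());
        System.out.println(soleAlias(topOps)); // single source -> "src"
        topOps.put("other", new Object());
        System.out.println(soleAlias(topOps)); // two sources -> null
    }
}
```

Note that the Hive snippet fetches the alias via `(String) pctx.getTopOps().keySet().toArray()[0]`; with a single-entry map, `keySet().iterator().next()` is the equivalent, cast-free idiom.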

Example 27 with FetchTask

use of org.apache.hadoop.hive.ql.exec.FetchTask in project hive by apache.

The class TestDbTxnManagerIsolationProperties, method gapOpenTxnsDirtyRead:

@Test
public void gapOpenTxnsDirtyRead() throws Exception {
    driver.run("drop table if exists gap");
    driver.run("create table gap (a int, b int) " + "stored as orc TBLPROPERTIES ('transactional'='true')");
    // Create one TXN to delete later
    driver.compileAndRespond("select * from gap");
    long first = txnMgr.getCurrentTxnId();
    driver.run();
    // The second one we use for Low water mark
    driver.run("select * from gap");
    DbTxnManager txnMgr2 = (DbTxnManager) TxnManagerFactory.getTxnManagerFactory().getTxnManager(conf);
    swapTxnManager(txnMgr2);
    // Now we wait for the time window to move forward
    Thread.sleep(txnHandler.getOpenTxnTimeOutMillis());
    // Create a gap
    deleteTransactionId(first);
    CommandProcessorResponse resp = driver2.compileAndRespond("select * from gap");
    long third = txnMgr2.getCurrentTxnId();
    Assert.assertTrue("Sequence number goes onward", third > first);
    ValidTxnList validTxns = txnMgr2.getValidTxns();
    Assert.assertNull("Expect to see no gap", validTxns.getMinOpenTxn());
    // Now we cheat and create a transaction with the first sequenceId again imitating a very slow openTxns call
    // This should never happen
    setBackSequence(first);
    swapTxnManager(txnMgr);
    driver.compileAndRespond("insert into gap values(1,2)");
    long fourth = txnMgr.getCurrentTxnId();
    Assert.assertEquals(first, fourth);
    driver.run();
    // Now we run our read query it should unfortunately see the results of the insert
    swapTxnManager(txnMgr2);
    driver2.run();
    FetchTask fetchTask = driver2.getFetchTask();
    List<Object> res = new ArrayList<>();
    fetchTask.fetch(res);
    Assert.assertEquals("Dirty read!", 1, res.size());
}
Also used : CommandProcessorResponse (org.apache.hadoop.hive.ql.processors.CommandProcessorResponse), ValidTxnList (org.apache.hadoop.hive.common.ValidTxnList), ArrayList (java.util.ArrayList), List (java.util.List), FetchTask (org.apache.hadoop.hive.ql.exec.FetchTask), Test (org.junit.Test)
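The test engineers a dirty read: the reader's snapshot is taken when the deleted transaction id has aged out of the open-transaction window, so a slow writer that later reuses that id commits data the snapshot wrongly considers visible. A toy model of that visibility rule (not Hive code; the class and method names are invented for illustration):

```java
import java.util.HashSet;
import java.util.Set;

// Toy snapshot-visibility model: an id is visible iff it is at or below the
// snapshot's high-water mark and was not open when the snapshot was taken.
public class SnapshotModel {

    public static boolean isVisible(long txnId, long highWaterMark, Set<Long> openAtSnapshot) {
        return txnId <= highWaterMark && !openAtSnapshot.contains(txnId);
    }

    public static void main(String[] args) {
        long first = 1L;                   // txn deleted earlier, leaving a gap
        long readerHwm = 3L;               // reader's snapshot high-water mark
        Set<Long> open = new HashSet<>();  // the gap has aged out of the open set

        // A slow writer reuses id `first` and commits AFTER the snapshot was
        // taken; the snapshot still treats id 1 as visible: a dirty read.
        System.out.println(isVisible(first, readerHwm, open)); // true
    }
}
```

This is why the test asserts `res.size() == 1` at the end: the reader, whose ValidTxnList saw no open transaction below its high-water mark, observes the row inserted by the transaction that reused the deleted sequence id.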

Aggregations

FetchTask (org.apache.hadoop.hive.ql.exec.FetchTask): 27
ArrayList (java.util.ArrayList): 11
HiveException (org.apache.hadoop.hive.ql.metadata.HiveException): 8
List (java.util.List): 7
ValidTxnList (org.apache.hadoop.hive.common.ValidTxnList): 7
Path (org.apache.hadoop.fs.Path): 6
FetchWork (org.apache.hadoop.hive.ql.plan.FetchWork): 6
TableDesc (org.apache.hadoop.hive.ql.plan.TableDesc): 6
Test (org.junit.Test): 6
IOException (java.io.IOException): 5
HiveConf (org.apache.hadoop.hive.conf.HiveConf): 5
Context (org.apache.hadoop.hive.ql.Context): 4
CacheUsage (org.apache.hadoop.hive.ql.cache.results.CacheUsage): 4
FileSinkOperator (org.apache.hadoop.hive.ql.exec.FileSinkOperator): 4
TableScanOperator (org.apache.hadoop.hive.ql.exec.TableScanOperator): 4
Operator (org.apache.hadoop.hive.ql.exec.Operator): 3
Task (org.apache.hadoop.hive.ql.exec.Task): 3
CommandProcessorException (org.apache.hadoop.hive.ql.processors.CommandProcessorException): 3
LinkedHashMap (java.util.LinkedHashMap): 2
LinkedHashSet (java.util.LinkedHashSet): 2