
Example 96 with Context

Use of org.apache.hadoop.hive.ql.Context in the apache/hive project.

The class TestReplicationMetricUpdateOnFailure, method testRecoverableDDLFailureWithStageMissing:

@Test
public void testRecoverableDDLFailureWithStageMissing() throws Exception {
    // task-setup for DDL-Task
    DDLWork ddlWork = Mockito.mock(DDLWork.class);
    Context context = Mockito.mock(Context.class);
    Mockito.when(context.getExplainAnalyze()).thenReturn(ExplainConfiguration.AnalyzeState.ANALYZING);
    Mockito.when(ddlWork.isReplication()).thenReturn(true);
    String dumpDir = TEST_PATH + Path.SEPARATOR + testName.getMethodName();
    Mockito.when(ddlWork.getDumpDirectory()).thenReturn(dumpDir);
    Task<DDLWork> ddlTask = TaskFactory.get(ddlWork, conf);
    ddlTask.initialize(null, null, null, context);
    MetricCollector.getInstance().deinit();
    IncrementalLoadMetricCollector metricCollector = new IncrementalLoadMetricCollector(null, TEST_PATH, 1, conf);
    // ensure stages are missing initially and execute without reporting start metrics
    Assert.assertEquals(0, MetricCollector.getInstance().getMetrics().size());
    Map<String, Long> metricMap = new HashMap<>(); // declared but not used in this test
    Mockito.when(ddlWork.getMetricCollector()).thenReturn(metricCollector);
    Mockito.when(ddlWork.getDDLDesc()).thenThrow(recoverableException);
    // test recoverable error during DDL-Task
    ddlTask.execute();
    performRecoverableChecks("REPL_LOAD");
}
Also used: Context (org.apache.hadoop.hive.ql.Context), DDLWork (org.apache.hadoop.hive.ql.ddl.DDLWork), HashMap (java.util.HashMap), IncrementalLoadMetricCollector (org.apache.hadoop.hive.ql.parse.repl.load.metric.IncrementalLoadMetricCollector), Test (org.junit.Test)

Example 97 with Context

Use of org.apache.hadoop.hive.ql.Context in the apache/hive project.

The class TestReplicationMetricUpdateOnFailure, method testNonRecoverableDDLFailureWithStageMissing:

@Test
public void testNonRecoverableDDLFailureWithStageMissing() throws Exception {
    // task-setup for DDL-Task
    DDLWork ddlWork = Mockito.mock(DDLWork.class);
    Context context = Mockito.mock(Context.class);
    Mockito.when(context.getExplainAnalyze()).thenReturn(ExplainConfiguration.AnalyzeState.ANALYZING);
    Mockito.when(ddlWork.isReplication()).thenReturn(true);
    String dumpDir = TEST_PATH + Path.SEPARATOR + testName.getMethodName();
    Mockito.when(ddlWork.getDumpDirectory()).thenReturn(dumpDir);
    Task<DDLWork> ddlTask = TaskFactory.get(ddlWork, conf);
    ddlTask.initialize(null, null, null, context);
    MetricCollector.getInstance().deinit();
    IncrementalLoadMetricCollector metricCollector = new IncrementalLoadMetricCollector(null, TEST_PATH, 1, conf);
    // ensure stages are missing initially and execute without reporting start metrics
    Assert.assertEquals(0, MetricCollector.getInstance().getMetrics().size());
    Map<String, Long> metricMap = new HashMap<>(); // declared but not used in this test
    Mockito.when(ddlWork.getMetricCollector()).thenReturn(metricCollector);
    Mockito.when(ddlWork.getDDLDesc()).thenThrow(nonRecoverableException);
    // test non-recoverable error during DDL-Task, without initializing stage
    ddlTask.execute();
    performNonRecoverableChecks(dumpDir, "REPL_LOAD");
}
Also used: Context (org.apache.hadoop.hive.ql.Context), DDLWork (org.apache.hadoop.hive.ql.ddl.DDLWork), HashMap (java.util.HashMap), IncrementalLoadMetricCollector (org.apache.hadoop.hive.ql.parse.repl.load.metric.IncrementalLoadMetricCollector), Test (org.junit.Test)
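The two tests above share one pattern: stub the work object so that fetching its descriptor throws, run the task, and then check how the failure was handled. The same pattern can be sketched in plain Java; the names below (Work, execute, the exception-based recoverable/non-recoverable split) are hand-rolled stand-ins for illustration, not the real Hive or Mockito APIs:

```java
// Illustrative sketch only: these types are stand-ins, not Hive classes.
public class RecoverableTaskSketch {

    // Stand-in for DDLWork: supplies a descriptor, or throws while trying.
    public interface Work {
        Object getDesc() throws Exception;
    }

    // Stand-in for Task.execute(): 0 on success, a non-zero code on failure,
    // with the code reflecting whether the failure counts as recoverable.
    public static int execute(Work work) {
        try {
            work.getDesc();
            return 0;
        } catch (Exception e) {
            // A real task would route this through its metric collector.
            boolean recoverable = e instanceof IllegalStateException;
            return recoverable ? 1 : 2;
        }
    }

    public static void main(String[] args) {
        // Throwing stubs, mirroring Mockito.when(...).thenThrow(...)
        Work failing = () -> { throw new IllegalStateException("recoverable"); };
        Work fatal = () -> { throw new RuntimeException("non-recoverable"); };
        System.out.println(execute(failing)); // prints 1
        System.out.println(execute(fatal));   // prints 2
    }
}
```

In the real tests, Mockito.when(ddlWork.getDDLDesc()).thenThrow(...) plays the role of the throwing lambda, and performRecoverableChecks / performNonRecoverableChecks inspect the metrics the task reported during execute().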

Example 98 with Context

Use of org.apache.hadoop.hive.ql.Context in the apache/hive project.

The class TestLineageInfo, method before:

@Before
public void before() {
    HiveConf conf = new HiveConf();
    SessionState.start(conf);
    ctx = new Context(conf);
}
Also used: Context (org.apache.hadoop.hive.ql.Context), HiveConf (org.apache.hadoop.hive.conf.HiveConf), Before (org.junit.Before)

Example 99 with Context

Use of org.apache.hadoop.hive.ql.Context in the apache/hive project.

The class UnlockTableOperation, method execute:

@Override
public int execute() throws HiveException {
    Context ctx = context.getContext();
    HiveTxnManager txnManager = ctx.getHiveTxnManager();
    return txnManager.unlockTable(context.getDb(), desc);
}
Also used: DDLOperationContext (org.apache.hadoop.hive.ql.ddl.DDLOperationContext), Context (org.apache.hadoop.hive.ql.Context), HiveTxnManager (org.apache.hadoop.hive.ql.lockmgr.HiveTxnManager)

Example 100 with Context

Use of org.apache.hadoop.hive.ql.Context in the apache/hive project.

The class TestNullScanTaskDispatcher, method setup:

@Before
public void setup() {
    hiveConf = new HiveConf();
    hiveConf.set("fs.mock.impl", MockFileSystem.class.getName());
    hiveConf.setBoolVar(HiveConf.ConfVars.HIVEMETADATAONLYQUERIES, true);
    sessionState = SessionState.start(hiveConf);
    parseContext = spy(new ParseContext());
    context = new Context(hiveConf);
    parseContext.setTopOps(aliasToWork);
    mapWork.setAliasToWork(aliasToWork);
    createReduceWork();
}
Also used: Context (org.apache.hadoop.hive.ql.Context), ParseContext (org.apache.hadoop.hive.ql.parse.ParseContext), CompilationOpContext (org.apache.hadoop.hive.ql.CompilationOpContext), HiveConf (org.apache.hadoop.hive.conf.HiveConf), MockFileSystem (org.apache.hive.common.util.MockFileSystem), Before (org.junit.Before)

Aggregations

Context (org.apache.hadoop.hive.ql.Context): 103
Path (org.apache.hadoop.fs.Path): 45
IOException (java.io.IOException): 26
HiveException (org.apache.hadoop.hive.ql.metadata.HiveException): 21
CompilationOpContext (org.apache.hadoop.hive.ql.CompilationOpContext): 20
Test (org.junit.Test): 19
FileSystem (org.apache.hadoop.fs.FileSystem): 16
HiveConf (org.apache.hadoop.hive.conf.HiveConf): 16
MapWork (org.apache.hadoop.hive.ql.plan.MapWork): 16
DriverContext (org.apache.hadoop.hive.ql.DriverContext): 15
HashMap (java.util.HashMap): 13
HiveTxnManager (org.apache.hadoop.hive.ql.lockmgr.HiveTxnManager): 13
ParseContext (org.apache.hadoop.hive.ql.parse.ParseContext): 13
TableDesc (org.apache.hadoop.hive.ql.plan.TableDesc): 13
ArrayList (java.util.ArrayList): 12
Task (org.apache.hadoop.hive.ql.exec.Task): 12
Table (org.apache.hadoop.hive.ql.metadata.Table): 12
JobConf (org.apache.hadoop.mapred.JobConf): 12
DDLWork (org.apache.hadoop.hive.ql.ddl.DDLWork): 9
QueryState (org.apache.hadoop.hive.ql.QueryState): 8