
Example 6 with DDLTask

Use of org.apache.hadoop.hive.ql.ddl.DDLTask in the apache/hive project.

From class TestHiveDecimalParse, method getColumnType:

private String getColumnType(String query) {
    Driver driver = createDriver();
    // Compile only; a non-zero return code means the statement failed to compile.
    int rc = driver.compile(query, true);
    if (rc != 0) {
        return null;
    }
    // For a CREATE TABLE statement the plan has a single root DDLTask whose
    // DDLWork wraps the CreateTableDesc produced by semantic analysis.
    QueryPlan plan = driver.getPlan();
    DDLTask task = (DDLTask) plan.getRootTasks().get(0);
    DDLWork work = task.getWork();
    CreateTableDesc spec = (CreateTableDesc) work.getDDLDesc();
    // Return the declared type of the first column, e.g. "decimal(10,2)".
    FieldSchema fs = spec.getCols().get(0);
    return fs.getType();
}
Also used: CreateTableDesc (org.apache.hadoop.hive.ql.ddl.table.create.CreateTableDesc), DDLWork (org.apache.hadoop.hive.ql.ddl.DDLWork), DDLTask (org.apache.hadoop.hive.ql.ddl.DDLTask), FieldSchema (org.apache.hadoop.hive.metastore.api.FieldSchema), Driver (org.apache.hadoop.hive.ql.Driver), QueryPlan (org.apache.hadoop.hive.ql.QueryPlan)
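A minimal usage sketch of this helper, assuming it sits in a JUnit 4 test class (org.junit.Test and org.junit.Assert imported) where createDriver() builds a Driver from a test HiveConf; the test name, table name, and expected type string below are illustrative, not taken from the Hive sources:

@Test
public void testDecimalPrecisionScale() throws Exception {
    // Compiles the CREATE TABLE statement and reads the declared column type
    // back out of the CreateTableDesc via getColumnType above; the assertion
    // assumes Hive echoes the declared precision and scale.
    String type = getColumnType("create table dec_test(d decimal(10,2))");
    Assert.assertEquals("decimal(10,2)", type);
}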

Example 7 with DDLTask

Use of org.apache.hadoop.hive.ql.ddl.DDLTask in the apache/hive project.

From class DummySemanticAnalyzerHook1, method postAnalyze:

@Override
public void postAnalyze(HiveSemanticAnalyzerHookContext context, List<Task<?>> rootTasks) throws SemanticException {
    count = 0;
    if (!isCreateTable) {
        return;
    }
    // The DDLTask is the last root task of the CREATE TABLE plan; its DDLWork
    // carries the CreateTableDesc that is about to be executed.
    CreateTableDesc desc = (CreateTableDesc) ((DDLTask) rootTasks.get(rootTasks.size() - 1)).getWork().getDDLDesc();
    Map<String, String> tblProps = desc.getTblProps();
    if (tblProps == null) {
        tblProps = new HashMap<String, String>();
    }
    // Tag the table being created so tests can verify the hook ran.
    tblProps.put("createdBy", DummyCreateTableHook.class.getName());
    tblProps.put("Message", "Hive rocks!! Count: " + myCount);
    LogHelper console = SessionState.getConsole();
    console.printError("DummySemanticAnalyzerHook1 Post: Hive rocks!! Count: " + myCount);
}
Also used: CreateTableDesc (org.apache.hadoop.hive.ql.ddl.table.create.CreateTableDesc), DDLTask (org.apache.hadoop.hive.ql.ddl.DDLTask), LogHelper (org.apache.hadoop.hive.ql.session.SessionState.LogHelper)
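For context, a hook like this only runs if it is registered with the compiler via the hive.semantic.analyzer.hook property. A minimal wiring sketch, assuming a test-style setup; the class name HookWiringSketch, the table name hook_demo, and the surrounding main method are illustrative, and the hook's own import is omitted because its package depends on where it lives in the test tree:

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.Driver;
import org.apache.hadoop.hive.ql.session.SessionState;

public class HookWiringSketch {
    public static void main(String[] args) throws Exception {
        HiveConf conf = new HiveConf();
        // Register the hook so the compiler invokes its preAnalyze/postAnalyze
        // around semantic analysis of each statement.
        conf.set("hive.semantic.analyzer.hook", DummySemanticAnalyzerHook1.class.getName());
        SessionState.start(conf);
        Driver driver = new Driver(conf);
        // postAnalyze above tags the new table with the createdBy and Message properties.
        driver.run("create table hook_demo (id int)");
    }
}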

Aggregations

DDLTask (org.apache.hadoop.hive.ql.ddl.DDLTask): 7 usages
CreateTableDesc (org.apache.hadoop.hive.ql.ddl.table.create.CreateTableDesc): 4 usages
HashMap (java.util.HashMap): 3 usages
Path (org.apache.hadoop.fs.Path): 3 usages
DDLWork (org.apache.hadoop.hive.ql.ddl.DDLWork): 3 usages
HashSet (java.util.HashSet): 2 usages
DDLDesc (org.apache.hadoop.hive.ql.ddl.DDLDesc): 2 usages
Task (org.apache.hadoop.hive.ql.exec.Task): 2 usages
HiveException (org.apache.hadoop.hive.ql.metadata.HiveException): 2 usages
Table (org.apache.hadoop.hive.ql.metadata.Table): 2 usages
IOException (java.io.IOException): 1 usage
ArrayList (java.util.ArrayList): 1 usage
LinkedHashMap (java.util.LinkedHashMap): 1 usage
LinkedHashSet (java.util.LinkedHashSet): 1 usage
List (java.util.List): 1 usage
Map (java.util.Map): 1 usage
Set (java.util.Set): 1 usage
TableName (org.apache.hadoop.hive.common.TableName): 1 usage
HiveConf (org.apache.hadoop.hive.conf.HiveConf): 1 usage
FieldSchema (org.apache.hadoop.hive.metastore.api.FieldSchema): 1 usage