
Example 6 with DDLDesc

Use of org.apache.hadoop.hive.ql.ddl.DDLDesc in the Apache Hive project.

The analyzeCommand method of the AbstractAlterTableArchiveAnalyzer class:

@Override
protected void analyzeCommand(TableName tableName, Map<String, String> partSpec,
        ASTNode command /* the AST tree */) throws SemanticException {
    if (!conf.getBoolVar(HiveConf.ConfVars.HIVEARCHIVEENABLED)) {
        throw new SemanticException(ErrorMsg.ARCHIVE_METHODS_DISABLED.getMsg());
    }
    Table table = getTable(tableName);
    validateAlterTableType(table, AlterTableType.ARCHIVE, false);
    List<Map<String, String>> partitionSpecs = getPartitionSpecs(table, command);
    if (partitionSpecs.size() > 1) {
        throw new SemanticException(getMultiPartsErrorMessage().getMsg());
    }
    if (partitionSpecs.size() == 0) {
        throw new SemanticException(ErrorMsg.ARCHIVE_ON_TABLE.getMsg());
    }
    Map<String, String> partitionSpec = partitionSpecs.get(0);
    try {
        isValidPrefixSpec(table, partitionSpec);
    } catch (HiveException e) {
        throw new SemanticException(e.getMessage(), e);
    }
    inputs.add(new ReadEntity(table));
    PartitionUtils.addTablePartsOutputs(db, outputs, table, partitionSpecs, true, WriteEntity.WriteType.DDL_NO_LOCK);
    DDLDesc archiveDesc = createDesc(tableName, partitionSpec);
    rootTasks.add(TaskFactory.get(new DDLWork(getInputs(), getOutputs(), archiveDesc)));
}
Also used:
ReadEntity (org.apache.hadoop.hive.ql.hooks.ReadEntity)
Table (org.apache.hadoop.hive.ql.metadata.Table)
HiveException (org.apache.hadoop.hive.ql.metadata.HiveException)
DDLWork (org.apache.hadoop.hive.ql.ddl.DDLWork)
Map (java.util.Map)
DDLDesc (org.apache.hadoop.hive.ql.ddl.DDLDesc)
SemanticException (org.apache.hadoop.hive.ql.parse.SemanticException)
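The tail of analyzeCommand shows the recurring pattern in Hive's DDL framework: build a DDLDesc describing the operation, wrap it in a DDLWork, and register the task produced by TaskFactory. The sketch below mimics that flow with minimal stand-in types so it compiles on its own; ArchiveDesc, its fields, and the trimmed-down DDLWork/TaskFactory here are hypothetical simplifications, not Hive's actual API.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical stand-ins for Hive's DDL framework types; the real classes
// live under org.apache.hadoop.hive.ql.ddl and carry much more state.
interface DDLDesc { }  // marker interface for a DDL operation descriptor

// Rough analogue of what createDesc(tableName, partitionSpec) might produce
// for ARCHIVE; the name and fields are illustrative only.
class ArchiveDesc implements DDLDesc {
    final String tableName;
    final Map<String, String> partitionSpec;
    ArchiveDesc(String tableName, Map<String, String> partitionSpec) {
        this.tableName = tableName;
        this.partitionSpec = partitionSpec;
    }
}

// DDLWork bundles a descriptor for execution (Hive's version also tracks
// the read/write entity sets passed as getInputs()/getOutputs()).
class DDLWork {
    final DDLDesc desc;
    DDLWork(DDLDesc desc) { this.desc = desc; }
}

// TaskFactory.get wraps a work unit into a task for the execution DAG.
class Task {
    final DDLWork work;
    Task(DDLWork work) { this.work = work; }
}

class TaskFactory {
    static Task get(DDLWork work) { return new Task(work); }
}

public class ArchiveAnalyzerSketch {
    public static void main(String[] args) {
        List<Task> rootTasks = new ArrayList<>();
        Map<String, String> partitionSpec = new LinkedHashMap<>();
        partitionSpec.put("ds", "2024-01-01");

        // Mirrors the tail of analyzeCommand: build the desc, wrap it in
        // DDLWork, and register the resulting task as a root task.
        DDLDesc archiveDesc = new ArchiveDesc("default.sales", partitionSpec);
        rootTasks.add(TaskFactory.get(new DDLWork(archiveDesc)));

        System.out.println(rootTasks.size());
    }
}
```

Because every DDL statement reduces to a DDLDesc, the analyzer only has to validate inputs and build the right descriptor; the shared DDLTask machinery then executes whatever the descriptor encodes.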

Aggregations

DDLDesc (org.apache.hadoop.hive.ql.ddl.DDLDesc): 6
SemanticException (org.apache.hadoop.hive.ql.parse.SemanticException): 4
DDLWork (org.apache.hadoop.hive.ql.ddl.DDLWork): 3
HiveException (org.apache.hadoop.hive.ql.metadata.HiveException): 3
Table (org.apache.hadoop.hive.ql.metadata.Table): 3
Map (java.util.Map): 2
Database (org.apache.hadoop.hive.metastore.api.Database): 2
DDLTask (org.apache.hadoop.hive.ql.ddl.DDLTask): 2
IOException (java.io.IOException): 1
HashMap (java.util.HashMap): 1
LinkedHashSet (java.util.LinkedHashSet): 1
List (java.util.List): 1
Path (org.apache.hadoop.fs.Path): 1
CreateDatabaseDesc (org.apache.hadoop.hive.ql.ddl.database.create.CreateDatabaseDesc): 1
DescDatabaseDesc (org.apache.hadoop.hive.ql.ddl.database.desc.DescDatabaseDesc): 1
DropDatabaseDesc (org.apache.hadoop.hive.ql.ddl.database.drop.DropDatabaseDesc): 1
ShowDatabasesDesc (org.apache.hadoop.hive.ql.ddl.database.show.ShowDatabasesDesc): 1
SwitchDatabaseDesc (org.apache.hadoop.hive.ql.ddl.database.use.SwitchDatabaseDesc): 1
CreateTableDesc (org.apache.hadoop.hive.ql.ddl.table.create.CreateTableDesc): 1
DescTableDesc (org.apache.hadoop.hive.ql.ddl.table.info.desc.DescTableDesc): 1