
Example 1 with TruncateTableDesc

Use of org.apache.hadoop.hive.ql.ddl.table.misc.truncate.TruncateTableDesc in the Apache Hive project.

From the class TruncateTableHandler, the handle method:

@Override
public List<Task<?>> handle(Context context) throws SemanticException {
    AlterTableMessage msg = deserializer.getAlterTableMessage(context.dmd.getPayload());
    // Use the context's database name unless it is empty; fall back to the
    // database name carried in the event message.
    final TableName tName = TableName.fromString(msg.getTable(), null,
            context.isDbNameEmpty() ? msg.getDB() : context.dbName);
    TruncateTableDesc truncateTableDesc =
            new TruncateTableDesc(tName, null, context.eventOnlyReplicationSpec());
    truncateTableDesc.setWriteId(msg.getWriteId());
    // Wrap the descriptor in a DDLWork and hand it to the task factory.
    Task<DDLWork> truncateTableTask = TaskFactory.get(
            new DDLWork(readEntitySet, writeEntitySet, truncateTableDesc, true,
                    context.getDumpDirectory(), context.getMetricCollector()),
            context.hiveConf);
    context.log.debug("Added truncate tbl task : {}:{}:{}",
            truncateTableTask.getId(), truncateTableDesc.getTableName(),
            truncateTableDesc.getWriteId());
    updatedMetadata.set(context.dmd.getEventTo().toString(), tName.getDb(), tName.getTable(), null);
    try {
        return ReplUtils.addChildTask(truncateTableTask);
    } catch (Exception e) {
        throw new SemanticException(e.getMessage());
    }
}
Also used : TableName(org.apache.hadoop.hive.common.TableName) DDLWork(org.apache.hadoop.hive.ql.ddl.DDLWork) TruncateTableDesc(org.apache.hadoop.hive.ql.ddl.table.misc.truncate.TruncateTableDesc) AlterTableMessage(org.apache.hadoop.hive.metastore.messaging.AlterTableMessage) SemanticException(org.apache.hadoop.hive.ql.parse.SemanticException)
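The handler above resolves the target database with a simple fallback: if the replication context carries no database name, the name embedded in the event message is used instead. A minimal standalone sketch of that fallback logic (the DbNameFallbackDemo class and resolveDbName helper are illustrative, not part of Hive's API, which does this inline with a ternary):

```java
// Sketch of the database-name fallback used when building the qualified
// table name. resolveDbName is a hypothetical helper for illustration.
public class DbNameFallbackDemo {
    static String resolveDbName(String contextDbName, String messageDbName) {
        // Mirrors: context.isDbNameEmpty() ? msg.getDB() : context.dbName
        boolean contextEmpty = contextDbName == null || contextDbName.isEmpty();
        return contextEmpty ? messageDbName : contextDbName;
    }

    public static void main(String[] args) {
        // The replication context names a target database: it wins.
        System.out.println(resolveDbName("tgt_db", "src_db")); // tgt_db
        // No target database in the context: fall back to the event's database.
        System.out.println(resolveDbName("", "src_db"));       // src_db
    }
}
```

This mirrors how db-level replication policies can rename the target database while table-level events still carry the source database name.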

Example 2 with TruncateTableDesc

Use of org.apache.hadoop.hive.ql.ddl.table.misc.truncate.TruncateTableDesc in the Apache Hive project.

From the class TruncatePartitionHandler, the handle method:

@Override
public List<Task<?>> handle(Context context) throws SemanticException {
    AlterPartitionMessage msg = deserializer.getAlterPartitionMessage(context.dmd.getPayload());
    // Use the context's database name unless it is empty; fall back to the
    // database name carried in the event message.
    final TableName tName = TableName.fromString(msg.getTable(), null,
            context.isDbNameEmpty() ? msg.getDB() : context.dbName);
    // Build the partition spec by pairing each partition key with the
    // corresponding value from the partition's after-image, in declared order.
    Map<String, String> partSpec = new LinkedHashMap<>();
    org.apache.hadoop.hive.metastore.api.Table tblObj;
    try {
        tblObj = msg.getTableObj();
        Iterator<String> afterIterator = msg.getPtnObjAfter().getValuesIterator();
        for (FieldSchema fs : tblObj.getPartitionKeys()) {
            partSpec.put(fs.getName(), afterIterator.next());
        }
    } catch (Exception e) {
        if (!(e instanceof SemanticException)) {
            throw new SemanticException("Error reading message members", e);
        } else {
            throw (SemanticException) e;
        }
    }
    TruncateTableDesc truncateTableDesc =
            new TruncateTableDesc(tName, partSpec, context.eventOnlyReplicationSpec());
    truncateTableDesc.setWriteId(msg.getWriteId());
    // Wrap the descriptor in a DDLWork and hand it to the task factory.
    Task<DDLWork> truncatePtnTask = TaskFactory.get(
            new DDLWork(readEntitySet, writeEntitySet, truncateTableDesc, true,
                    context.getDumpDirectory(), context.getMetricCollector()),
            context.hiveConf);
    context.log.debug("Added truncate ptn task : {}:{}:{}",
            truncatePtnTask.getId(), truncateTableDesc.getTableName(),
            truncateTableDesc.getWriteId());
    updatedMetadata.set(context.dmd.getEventTo().toString(), tName.getDb(), tName.getTable(), partSpec);
    try {
        return ReplUtils.addChildTask(truncatePtnTask);
    } catch (Exception e) {
        throw new SemanticException(e.getMessage());
    }
}
Also used : TruncateTableDesc(org.apache.hadoop.hive.ql.ddl.table.misc.truncate.TruncateTableDesc) FieldSchema(org.apache.hadoop.hive.metastore.api.FieldSchema) SemanticException(org.apache.hadoop.hive.ql.parse.SemanticException) LinkedHashMap(java.util.LinkedHashMap) TableName(org.apache.hadoop.hive.common.TableName) DDLWork(org.apache.hadoop.hive.ql.ddl.DDLWork) AlterPartitionMessage(org.apache.hadoop.hive.metastore.messaging.AlterPartitionMessage)
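The loop in Example 2 zips the table's partition keys with the values iterator from the partition's after-image, using a LinkedHashMap so the resulting spec preserves partition-key order. A minimal standalone sketch of that pairing (the PartSpecDemo class and the year/month keys and values are made up for illustration):

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of building a partition spec by zipping partition-key names with a
// values iterator, as the TruncatePartitionHandler loop does.
public class PartSpecDemo {
    static Map<String, String> buildPartSpec(List<String> partitionKeys, Iterator<String> values) {
        // LinkedHashMap keeps entries in insertion order, i.e. the order in
        // which the partition keys were declared on the table.
        Map<String, String> partSpec = new LinkedHashMap<>();
        for (String key : partitionKeys) {
            partSpec.put(key, values.next());
        }
        return partSpec;
    }

    public static void main(String[] args) {
        // Hypothetical two-level partitioning (year, month) with after-image
        // values ("2023", "07") as they would arrive in the event message.
        Map<String, String> spec = buildPartSpec(
                Arrays.asList("year", "month"),
                Arrays.asList("2023", "07").iterator());
        System.out.println(spec); // {year=2023, month=07}
    }
}
```

Order matters here because a partition spec like (year=2023, month=07) identifies the partition positionally against the table's partition-key declaration.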

Aggregations

TableName (org.apache.hadoop.hive.common.TableName): 2 uses
DDLWork (org.apache.hadoop.hive.ql.ddl.DDLWork): 2 uses
TruncateTableDesc (org.apache.hadoop.hive.ql.ddl.table.misc.truncate.TruncateTableDesc): 2 uses
SemanticException (org.apache.hadoop.hive.ql.parse.SemanticException): 2 uses
LinkedHashMap (java.util.LinkedHashMap): 1 use
FieldSchema (org.apache.hadoop.hive.metastore.api.FieldSchema): 1 use
AlterPartitionMessage (org.apache.hadoop.hive.metastore.messaging.AlterPartitionMessage): 1 use
AlterTableMessage (org.apache.hadoop.hive.metastore.messaging.AlterTableMessage): 1 use