
Example 1 with AlterTableRenamePartitionDesc

Use of org.apache.hadoop.hive.ql.ddl.table.partition.rename.AlterTableRenamePartitionDesc in project hive by apache.

The example is the handle method of the RenamePartitionHandler class.

@Override
public List<Task<?>> handle(Context context) throws SemanticException {
    AlterPartitionMessage msg = deserializer.getAlterPartitionMessage(context.dmd.getPayload());
    String actualDbName = context.isDbNameEmpty() ? msg.getDB() : context.dbName;
    String actualTblName = msg.getTable();
    Map<String, String> newPartSpec = new LinkedHashMap<>();
    Map<String, String> oldPartSpec = new LinkedHashMap<>();
    TableName tableName = TableName.fromString(actualTblName, null, actualDbName);
    Table tableObj;
    ReplicationSpec replicationSpec = context.eventOnlyReplicationSpec();
    try {
        // Pair each partition key, in key order, with the before/after values
        // carried by the ALTER_PARTITION event.
        Iterator<String> beforeIterator = msg.getPtnObjBefore().getValuesIterator();
        Iterator<String> afterIterator = msg.getPtnObjAfter().getValuesIterator();
        tableObj = msg.getTableObj();
        for (FieldSchema fs : tableObj.getPartitionKeys()) {
            oldPartSpec.put(fs.getName(), beforeIterator.next());
            newPartSpec.put(fs.getName(), afterIterator.next());
        }
        // Build the rename-partition descriptor and wrap it in a DDL task.
        AlterTableRenamePartitionDesc renamePtnDesc =
            new AlterTableRenamePartitionDesc(tableName, oldPartSpec, newPartSpec, replicationSpec, null);
        renamePtnDesc.setWriteId(msg.getWriteId());
        Task<DDLWork> renamePtnTask = TaskFactory.get(
            new DDLWork(readEntitySet, writeEntitySet, renamePtnDesc, true,
                context.getDumpDirectory(), context.getMetricCollector()), context.hiveConf);
        context.log.debug("Added rename ptn task : {}:{}->{}", renamePtnTask.getId(), oldPartSpec, newPartSpec);
        // Record the event id and the new partition spec so replication state can be advanced.
        updatedMetadata.set(context.dmd.getEventTo().toString(), actualDbName, actualTblName, newPartSpec);
        return ReplUtils.addChildTask(renamePtnTask);
    } catch (Exception e) {
        throw (e instanceof SemanticException)
            ? (SemanticException) e
            : new SemanticException("Error reading message members", e);
    }
}
Also used :
ReplicationSpec (org.apache.hadoop.hive.ql.parse.ReplicationSpec)
Table (org.apache.hadoop.hive.metastore.api.Table)
AlterTableRenamePartitionDesc (org.apache.hadoop.hive.ql.ddl.table.partition.rename.AlterTableRenamePartitionDesc)
FieldSchema (org.apache.hadoop.hive.metastore.api.FieldSchema)
SemanticException (org.apache.hadoop.hive.ql.parse.SemanticException)
LinkedHashMap (java.util.LinkedHashMap)
TableName (org.apache.hadoop.hive.common.TableName)
DDLWork (org.apache.hadoop.hive.ql.ddl.DDLWork)
AlterPartitionMessage (org.apache.hadoop.hive.metastore.messaging.AlterPartitionMessage)

Aggregations

LinkedHashMap (java.util.LinkedHashMap): 1
TableName (org.apache.hadoop.hive.common.TableName): 1
FieldSchema (org.apache.hadoop.hive.metastore.api.FieldSchema): 1
Table (org.apache.hadoop.hive.metastore.api.Table): 1
AlterPartitionMessage (org.apache.hadoop.hive.metastore.messaging.AlterPartitionMessage): 1
DDLWork (org.apache.hadoop.hive.ql.ddl.DDLWork): 1
AlterTableRenamePartitionDesc (org.apache.hadoop.hive.ql.ddl.table.partition.rename.AlterTableRenamePartitionDesc): 1
ReplicationSpec (org.apache.hadoop.hive.ql.parse.ReplicationSpec): 1
SemanticException (org.apache.hadoop.hive.ql.parse.SemanticException): 1
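
The core of handle() above is the pairing of partition-key names with the before/after values carried by the ALTER_PARTITION event, in key order, before those maps are passed to AlterTableRenamePartitionDesc. The following is a minimal sketch of that mapping using only plain Java collections; the class RenameSpecSketch, the method zipSpec, and the sample keys and values are illustrative names invented here, not Hive API, and in the real handler the keys come from the metastore Table object and the values from the AlterPartitionMessage.

import java.util.Arrays;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch (not Hive API): assembles ordered old/new partition specs
// from partition-key names and before/after value iterators, mirroring the loop
// over tableObj.getPartitionKeys() in RenamePartitionHandler.handle().
public class RenameSpecSketch {

    static Map<String, String> zipSpec(List<String> keys, Iterator<String> values) {
        // LinkedHashMap preserves partition-key order, as in the handler.
        Map<String, String> spec = new LinkedHashMap<>();
        for (String key : keys) {
            spec.put(key, values.next());
        }
        return spec;
    }

    public static void main(String[] args) {
        List<String> partitionKeys = Arrays.asList("country", "dt");          // hypothetical keys
        Iterator<String> before = Arrays.asList("us", "2021-01-01").iterator(); // hypothetical old values
        Iterator<String> after = Arrays.asList("us", "2021-01-02").iterator();  // hypothetical new values

        Map<String, String> oldPartSpec = zipSpec(partitionKeys, before);
        Map<String, String> newPartSpec = zipSpec(partitionKeys, after);

        // Prints: {country=us, dt=2021-01-01} -> {country=us, dt=2021-01-02}
        System.out.println(oldPartSpec + " -> " + newPartSpec);
    }
}

Using a LinkedHashMap matters here: the partition spec must keep the same key order as the table's partition columns, which is why the handler iterates getPartitionKeys() rather than building the maps in arbitrary order.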