Example 31 with DDLWork

Use of org.apache.hadoop.hive.ql.plan.DDLWork in project hive by apache.

From the class DDLSemanticAnalyzer, method analyzeCreateResourcePlan.

private void analyzeCreateResourcePlan(ASTNode ast) throws SemanticException {
    if (ast.getChildCount() == 0) {
        throw new SemanticException("Expected name in CREATE RESOURCE PLAN statement");
    }
    String resourcePlanName = unescapeIdentifier(ast.getChild(0).getText());
    Integer queryParallelism = null;
    String likeName = null;
    for (int i = 1; i < ast.getChildCount(); ++i) {
        Tree child = ast.getChild(i);
        switch (child.getType()) {
            case HiveParser.TOK_QUERY_PARALLELISM:
                // Note: later we may be able to set multiple things together (except LIKE).
                if (queryParallelism == null && likeName == null) {
                    queryParallelism = Integer.parseInt(child.getChild(0).getText());
                } else {
                    throw new SemanticException("Conflicting create arguments " + ast.toStringTree());
                }
                break;
            case HiveParser.TOK_LIKERP:
                if (queryParallelism == null && likeName == null) {
                    likeName = unescapeIdentifier(child.getChild(0).getText());
                } else {
                    throw new SemanticException("Conflicting create arguments " + ast.toStringTree());
                }
                break;
            default:
                throw new SemanticException("Invalid create arguments " + ast.toStringTree());
        }
    }
    CreateResourcePlanDesc desc = new CreateResourcePlanDesc(resourcePlanName, queryParallelism, likeName);
    addServiceOutput();
    rootTasks.add(TaskFactory.get(new DDLWork(getInputs(), getOutputs(), desc)));
}
Also used : DDLWork(org.apache.hadoop.hive.ql.plan.DDLWork) CreateResourcePlanDesc(org.apache.hadoop.hive.ql.plan.CreateResourcePlanDesc) CommonTree(org.antlr.runtime.tree.CommonTree) Tree(org.antlr.runtime.tree.Tree) SQLUniqueConstraint(org.apache.hadoop.hive.metastore.api.SQLUniqueConstraint) NotNullConstraint(org.apache.hadoop.hive.ql.metadata.NotNullConstraint) DefaultConstraint(org.apache.hadoop.hive.ql.metadata.DefaultConstraint) SQLCheckConstraint(org.apache.hadoop.hive.metastore.api.SQLCheckConstraint) SQLNotNullConstraint(org.apache.hadoop.hive.metastore.api.SQLNotNullConstraint) SQLDefaultConstraint(org.apache.hadoop.hive.metastore.api.SQLDefaultConstraint)
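
For reference, the two mutually exclusive clauses this method accepts correspond to Hive's workload-management DDL. A minimal sketch of statements that would take this path (plan names are hypothetical):

CREATE RESOURCE PLAN plan_a;
CREATE RESOURCE PLAN plan_b WITH QUERY_PARALLELISM=4;
CREATE RESOURCE PLAN plan_c LIKE plan_a;

As the analyzer enforces, WITH QUERY_PARALLELISM and LIKE cannot be combined in one statement.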

Example 32 with DDLWork

Use of org.apache.hadoop.hive.ql.plan.DDLWork in project hive by apache.

From the class DDLSemanticAnalyzer, method analyzeDropTable.

private void analyzeDropTable(ASTNode ast, TableType expectedType) throws SemanticException {
    String tableName = getUnescapedName((ASTNode) ast.getChild(0));
    boolean ifExists = (ast.getFirstChildWithType(HiveParser.TOK_IFEXISTS) != null);
    // we want to signal an error if the table/view doesn't exist and we're
    // configured not to fail silently
    boolean throwException = !ifExists && !HiveConf.getBoolVar(conf, ConfVars.DROPIGNORESNONEXISTENT);
    ReplicationSpec replicationSpec = new ReplicationSpec(ast);
    Table tab = getTable(tableName, throwException);
    if (tab != null) {
        inputs.add(new ReadEntity(tab));
        outputs.add(new WriteEntity(tab, WriteEntity.WriteType.DDL_EXCLUSIVE));
    }
    boolean ifPurge = (ast.getFirstChildWithType(HiveParser.KW_PURGE) != null);
    DropTableDesc dropTblDesc = new DropTableDesc(tableName, expectedType, ifExists, ifPurge, replicationSpec);
    rootTasks.add(TaskFactory.get(new DDLWork(getInputs(), getOutputs(), dropTblDesc)));
}
Also used : ReadEntity(org.apache.hadoop.hive.ql.hooks.ReadEntity) Table(org.apache.hadoop.hive.ql.metadata.Table) DDLWork(org.apache.hadoop.hive.ql.plan.DDLWork) DropTableDesc(org.apache.hadoop.hive.ql.plan.DropTableDesc) WriteEntity(org.apache.hadoop.hive.ql.hooks.WriteEntity)
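
Statements that would exercise this path (table and view names are hypothetical); IF EXISTS suppresses the missing-table error, and PURGE deletes the data immediately instead of moving it to the trash:

DROP TABLE t1;
DROP TABLE IF EXISTS t1 PURGE;
DROP VIEW IF EXISTS v1;

The same method serves both DROP TABLE and DROP VIEW; the expectedType argument tells it which kind of object the statement may legally target.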

Example 33 with DDLWork

Use of org.apache.hadoop.hive.ql.plan.DDLWork in project hive by apache.

From the class DDLSemanticAnalyzer, method analyzeDescribeTable.

/**
 * A query like this will generate a tree as follows
 *   "describe formatted default.maptable partition (b=100) id;"
 * TOK_TABTYPE
 *   TOK_TABNAME --> root for tablename, 2 child nodes mean DB specified
 *     default
 *     maptable
 *   TOK_PARTSPEC  --> root node for partition spec. else columnName
 *     TOK_PARTVAL
 *       b
 *       100
 *   id           --> root node for columnName
 * formatted     --> node for the describe option (FORMATTED or EXTENDED)
 */
private void analyzeDescribeTable(ASTNode ast) throws SemanticException {
    ASTNode tableTypeExpr = (ASTNode) ast.getChild(0);
    String dbName = null;
    String tableName = null;
    String colPath = null;
    Map<String, String> partSpec = null;
    ASTNode tableNode = null;
    // tablename is either TABLENAME or DBNAME.TABLENAME if db is given
    if (((ASTNode) tableTypeExpr.getChild(0)).getType() == HiveParser.TOK_TABNAME) {
        tableNode = (ASTNode) tableTypeExpr.getChild(0);
        if (tableNode.getChildCount() == 1) {
            tableName = ((ASTNode) tableNode.getChild(0)).getText();
        } else {
            dbName = ((ASTNode) tableNode.getChild(0)).getText();
            tableName = dbName + "." + ((ASTNode) tableNode.getChild(1)).getText();
        }
    } else {
        throw new SemanticException(((ASTNode) tableTypeExpr.getChild(0)).getText() + " is not an expected token type");
    }
    // process the second child node, if it exists, to get the partition spec(s)
    partSpec = QualifiedNameUtil.getPartitionSpec(db, tableTypeExpr, tableName);
    // process the third child node, if it exists, to get the column path
    colPath = QualifiedNameUtil.getColPath(db, tableTypeExpr, dbName, tableName, partSpec);
    // validate database
    if (dbName != null) {
        validateDatabase(dbName);
    }
    if (partSpec != null) {
        validateTable(tableName, partSpec);
    }
    DescTableDesc descTblDesc = new DescTableDesc(ctx.getResFile(), tableName, partSpec, colPath);
    boolean showColStats = false;
    if (ast.getChildCount() == 2) {
        int descOptions = ast.getChild(1).getType();
        descTblDesc.setFormatted(descOptions == HiveParser.KW_FORMATTED);
        descTblDesc.setExt(descOptions == HiveParser.KW_EXTENDED);
        // show column statistics only when a specific column is described with FORMATTED
        if (!colPath.equalsIgnoreCase(tableName) && descTblDesc.isFormatted()) {
            showColStats = true;
        }
    }
    inputs.add(new ReadEntity(getTable(tableName)));
    Task ddlTask = TaskFactory.get(new DDLWork(getInputs(), getOutputs(), descTblDesc));
    rootTasks.add(ddlTask);
    String schema = DescTableDesc.getSchema(showColStats);
    setFetchTask(createFetchTask(schema));
    LOG.info("analyzeDescribeTable done");
}
Also used : ReadEntity(org.apache.hadoop.hive.ql.hooks.ReadEntity) Task(org.apache.hadoop.hive.ql.exec.Task) ColumnStatsUpdateTask(org.apache.hadoop.hive.ql.exec.ColumnStatsUpdateTask) DDLWork(org.apache.hadoop.hive.ql.plan.DDLWork) DescTableDesc(org.apache.hadoop.hive.ql.plan.DescTableDesc) SQLUniqueConstraint(org.apache.hadoop.hive.metastore.api.SQLUniqueConstraint) NotNullConstraint(org.apache.hadoop.hive.ql.metadata.NotNullConstraint) DefaultConstraint(org.apache.hadoop.hive.ql.metadata.DefaultConstraint) SQLCheckConstraint(org.apache.hadoop.hive.metastore.api.SQLCheckConstraint) SQLNotNullConstraint(org.apache.hadoop.hive.metastore.api.SQLNotNullConstraint) SQLDefaultConstraint(org.apache.hadoop.hive.metastore.api.SQLDefaultConstraint)
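
A few statements this method accepts, following the tree shape sketched in the Javadoc above (names are hypothetical):

DESCRIBE maptable;
DESCRIBE EXTENDED default.maptable;
DESCRIBE FORMATTED default.maptable PARTITION (b=100) id;

Only the last form, FORMATTED with an explicit column, sets showColStats, so column statistics are fetched only in that case.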

Example 34 with DDLWork

Use of org.apache.hadoop.hive.ql.plan.DDLWork in project hive by apache.

From the class DDLSemanticAnalyzer, method analyzeShowTableProperties.

private void analyzeShowTableProperties(ASTNode ast) throws SemanticException {
    ShowTblPropertiesDesc showTblPropertiesDesc;
    String[] qualified = getQualifiedTableName((ASTNode) ast.getChild(0));
    String propertyName = null;
    if (ast.getChildCount() > 1) {
        propertyName = unescapeSQLString(ast.getChild(1).getText());
    }
    String tableNames = getDotName(qualified);
    validateTable(tableNames, null);
    showTblPropertiesDesc = new ShowTblPropertiesDesc(ctx.getResFile().toString(), tableNames, propertyName);
    rootTasks.add(TaskFactory.get(new DDLWork(getInputs(), getOutputs(), showTblPropertiesDesc)));
    setFetchTask(createFetchTask(showTblPropertiesDesc.getSchema()));
}
Also used : DDLWork(org.apache.hadoop.hive.ql.plan.DDLWork) ShowTblPropertiesDesc(org.apache.hadoop.hive.ql.plan.ShowTblPropertiesDesc)
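
Statements that reach this analyzer (table and property names are hypothetical); the optional quoted argument, matching the unescapeSQLString call above, selects a single property instead of listing them all:

SHOW TBLPROPERTIES t1;
SHOW TBLPROPERTIES t1("comment");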

Example 35 with DDLWork

Use of org.apache.hadoop.hive.ql.plan.DDLWork in project hive by apache.

From the class DDLSemanticAnalyzer, method analyzeAlterTableRename.

private void analyzeAlterTableRename(String[] source, ASTNode ast, boolean expectView) throws SemanticException {
    String[] target = getQualifiedTableName((ASTNode) ast.getChild(0));
    String sourceName = getDotName(source);
    String targetName = getDotName(target);
    AlterTableDesc alterTblDesc = new AlterTableDesc(sourceName, targetName, expectView, null);
    addInputsOutputsAlterTable(sourceName, null, alterTblDesc);
    rootTasks.add(TaskFactory.get(new DDLWork(getInputs(), getOutputs(), alterTblDesc)));
}
Also used : AlterTableDesc(org.apache.hadoop.hive.ql.plan.AlterTableDesc) DDLWork(org.apache.hadoop.hive.ql.plan.DDLWork)
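
Statements that would take this path (names are hypothetical); the expectView flag distinguishes the view variant:

ALTER TABLE t1 RENAME TO t2;
ALTER VIEW v1 RENAME TO v2;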

Aggregations

DDLWork (org.apache.hadoop.hive.ql.plan.DDLWork): 141 usages
AlterTableDesc (org.apache.hadoop.hive.ql.plan.AlterTableDesc): 26 usages
SQLUniqueConstraint (org.apache.hadoop.hive.metastore.api.SQLUniqueConstraint): 24 usages
ReadEntity (org.apache.hadoop.hive.ql.hooks.ReadEntity): 24 usages
Table (org.apache.hadoop.hive.ql.metadata.Table): 22 usages
SQLCheckConstraint (org.apache.hadoop.hive.metastore.api.SQLCheckConstraint): 20 usages
SQLDefaultConstraint (org.apache.hadoop.hive.metastore.api.SQLDefaultConstraint): 20 usages
SQLNotNullConstraint (org.apache.hadoop.hive.metastore.api.SQLNotNullConstraint): 20 usages
PrincipalDesc (org.apache.hadoop.hive.ql.plan.PrincipalDesc): 20 usages
Test (org.junit.Test): 20 usages
ArrayList (java.util.ArrayList): 19 usages
DefaultConstraint (org.apache.hadoop.hive.ql.metadata.DefaultConstraint): 19 usages
NotNullConstraint (org.apache.hadoop.hive.ql.metadata.NotNullConstraint): 19 usages
HashMap (java.util.HashMap): 17 usages
LinkedHashMap (java.util.LinkedHashMap): 16 usages
WriteEntity (org.apache.hadoop.hive.ql.hooks.WriteEntity): 14 usages
Task (org.apache.hadoop.hive.ql.exec.Task): 11 usages
HiveException (org.apache.hadoop.hive.ql.metadata.HiveException): 11 usages
SemanticException (org.apache.hadoop.hive.ql.parse.SemanticException): 10 usages
Serializable (java.io.Serializable): 9 usages