
Example 71 with HiveException

Use of org.apache.hadoop.hive.ql.metadata.HiveException in project hive by apache.

In class DDLTask, method showGrants:

private int showGrants(Hive db, ShowGrantDesc showGrantDesc) throws HiveException {
    HiveAuthorizer authorizer = getSessionAuthorizer(db);
    try {
        List<HivePrivilegeInfo> privInfos = authorizer.showPrivileges(
                getAuthorizationTranslator(authorizer).getHivePrincipal(showGrantDesc.getPrincipalDesc()),
                getAuthorizationTranslator(authorizer).getHivePrivilegeObject(showGrantDesc.getHiveObj()));
        boolean testMode = conf.getBoolVar(HiveConf.ConfVars.HIVE_IN_TEST);
        writeToFile(writeGrantInfo(privInfos, testMode), showGrantDesc.getResFile());
    } catch (IOException e) {
        throw new HiveException("Error in show grant statement", e);
    }
    return 0;
}
Also used: HiveAuthorizer (org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAuthorizer), HivePrivilegeInfo (org.apache.hadoop.hive.ql.security.authorization.plugin.HivePrivilegeInfo), HiveException (org.apache.hadoop.hive.ql.metadata.HiveException), IOException (java.io.IOException)
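The pattern in showGrants is common throughout Hive: a low-level checked exception (here IOException) is caught and wrapped in the domain-level HiveException with a descriptive message, preserving the original cause. Below is a minimal, self-contained sketch of that wrap-and-rethrow idiom; the HiveException class here is a stand-in for Hive's real one, and writeResult is a hypothetical helper, not Hive code.

```java
import java.io.IOException;

// Stand-in for org.apache.hadoop.hive.ql.metadata.HiveException
// (a simplification for this sketch; the real class also extends Exception).
class HiveException extends Exception {
    HiveException(String message, Throwable cause) {
        super(message, cause);
    }
}

public class WrapAndRethrow {
    // Mirrors the showGrants shape: the IOException from the write step is
    // caught and rethrown as the domain-level HiveException, cause attached.
    static int writeResult(boolean failIo) throws HiveException {
        try {
            if (failIo) {
                throw new IOException("disk full");
            }
            // ... write the grant info to the result file ...
        } catch (IOException e) {
            throw new HiveException("Error in show grant statement", e);
        }
        return 0;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(writeResult(false));  // 0: the happy path
        try {
            writeResult(true);
        } catch (HiveException e) {
            // The low-level cause is preserved for diagnostics.
            System.out.println(e.getMessage() + " <- " + e.getCause().getMessage());
        }
    }
}
```

Wrapping rather than declaring IOException keeps the method's contract at the Hive abstraction level while losing no debugging information.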

Example 72 with HiveException

Use of org.apache.hadoop.hive.ql.metadata.HiveException in project SQLWindowing by hbutani.

In class PTFOperator, method processOp:

@Override
public void processOp(Object row, int tag) throws HiveException {
    try {
        if (!isMapOperator) {
            /*
             * Check if the current row belongs to the currently accumulated
             * partition:
             * - If not:
             *   - process the current partition
             *   - reset the input partition
             * - Set currentKeys to the new key if it is null or has changed.
             */
            newKeys.getNewKey(row, inputPart.getOI());
            boolean keysAreEqual = (currentKeys != null && newKeys != null) ? newKeys.equals(currentKeys) : false;
            if (currentKeys != null && !keysAreEqual) {
                processInputPartition();
                inputPart = RuntimeUtils.createFirstPartitionForChain(qDef, inputObjInspectors[0], hiveConf, isMapOperator);
            }
            if (currentKeys == null || !keysAreEqual) {
                if (currentKeys == null) {
                    currentKeys = newKeys.copyKey();
                } else {
                    currentKeys.copyKey(newKeys);
                }
            }
        }
        // add row to current Partition.
        inputPart.append(row);
    } catch (WindowingException we) {
        throw new HiveException("Cannot process PTFOperator.", we);
    }
}
Also used: HiveException (org.apache.hadoop.hive.ql.metadata.HiveException), WindowingException (com.sap.hadoop.windowing.WindowingException)
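The boundary logic in processOp can be illustrated without the Hive machinery: rows arrive sorted by partition key, and whenever the key changes, the accumulated partition is processed and a fresh one is started. The following is a self-contained sketch of that idea under simplifying assumptions (rows are `String[]{key, value}` pairs, and "processing" a partition just collects it); all names are illustrative, not Hive's.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;

public class PartitionBoundary {
    // Streams rows (assumed sorted by key) and flushes the accumulated
    // partition every time the key changes, mirroring processOp's structure.
    static List<List<String>> groupByKeyChange(List<String[]> rows) {
        List<List<String>> partitions = new ArrayList<>();
        List<String> current = new ArrayList<>();
        String currentKey = null;
        for (String[] row : rows) {
            String newKey = row[0];                    // newKeys.getNewKey(row, ...)
            boolean keysAreEqual = Objects.equals(currentKey, newKey);
            if (currentKey != null && !keysAreEqual) {
                partitions.add(current);               // processInputPartition()
                current = new ArrayList<>();           // reset the input partition
            }
            if (currentKey == null || !keysAreEqual) {
                currentKey = newKey;                   // copyKey(newKeys)
            }
            current.add(row[1]);                       // inputPart.append(row)
        }
        if (!current.isEmpty()) {
            partitions.add(current);                   // flush the final partition
        }
        return partitions;
    }

    public static void main(String[] args) {
        List<String[]> rows = List.of(
                new String[] { "a", "1" },
                new String[] { "a", "2" },
                new String[] { "b", "3" });
        System.out.println(groupByKeyChange(rows));  // [[1, 2], [3]]
    }
}
```

Note that, as in the original, the last partition has no key change after it; in Hive the flush for it happens in closeOp rather than processOp, which the trailing `if` stands in for here.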

Example 73 with HiveException

Use of org.apache.hadoop.hive.ql.metadata.HiveException in project hive by apache.

In class SemanticAnalyzer, method comparePathKeyStrength:

/**
 * Compares the key encryption strengths of two paths.
 *
 * @param p1 Path to an HDFS file system
 * @param p2 Path to an HDFS file system
 * @return -1 if p1's key is weaker than p2's; 0 if they are equal; 1 if p1's key is stronger
 * @throws HiveException If an error occurs while comparing key strengths.
 */
private int comparePathKeyStrength(Path p1, Path p2) throws HiveException {
    HadoopShims.HdfsEncryptionShim hdfsEncryptionShim;
    hdfsEncryptionShim = SessionState.get().getHdfsEncryptionShim();
    if (hdfsEncryptionShim != null) {
        try {
            return hdfsEncryptionShim.comparePathKeyStrength(p1, p2);
        } catch (Exception e) {
            throw new HiveException("Unable to compare key strength for " + p1 + " and " + p2 + " : " + e, e);
        }
    }
    // Non-encrypted path (or equal strength)
    return 0;
}
Also used: HiveException (org.apache.hadoop.hive.ql.metadata.HiveException), HadoopShims (org.apache.hadoop.hive.shims.HadoopShims), LockException (org.apache.hadoop.hive.ql.lockmgr.LockException), IOException (java.io.IOException), CalciteSemanticException (org.apache.hadoop.hive.ql.optimizer.calcite.CalciteSemanticException), MetaException (org.apache.hadoop.hive.metastore.api.MetaException), SerDeException (org.apache.hadoop.hive.serde2.SerDeException), PatternSyntaxException (java.util.regex.PatternSyntaxException), FileNotFoundException (java.io.FileNotFoundException), AccessControlException (java.security.AccessControlException), InvalidTableException (org.apache.hadoop.hive.ql.metadata.InvalidTableException)
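The -1/0/1 contract this method delegates to the encryption shim can be sketched in isolation. In the sketch below, key lengths are supplied directly as hypothetical stand-ins for what the HDFS encryption shim would resolve from each path's encryption zone, and `null` stands in for a non-encrypted path (or an absent shim), which the real method likewise treats as "equal".

```java
public class KeyStrength {
    // Hypothetical simplification: a path's key length in bits, or null when
    // the path is not in an encryption zone. The real comparePathKeyStrength
    // asks HadoopShims.HdfsEncryptionShim to do this resolution.
    static int comparePathKeyStrength(Integer bitsP1, Integer bitsP2) {
        if (bitsP1 == null || bitsP2 == null) {
            return 0;  // non-encrypted path (or no shim): treated as equal
        }
        // Negative: p1's key is weaker; zero: equal; positive: stronger.
        return Integer.compare(bitsP1, bitsP2);
    }

    public static void main(String[] args) {
        System.out.println(comparePathKeyStrength(128, 256));   // -1
        System.out.println(comparePathKeyStrength(null, 256));  // 0
        System.out.println(comparePathKeyStrength(256, 128));   // 1
    }
}
```

SemanticAnalyzer uses this ordering to decide which side of a query's staging/target pair is "safe enough" to write to.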

Example 74 with HiveException

Use of org.apache.hadoop.hive.ql.metadata.HiveException in project hive by apache.

In class SemanticAnalyzer, method isPathReadOnly:

/**
 * Checks if a given path has read-only access permissions.
 *
 * @param path The path to check for read-only permissions.
 * @return True if the path is read-only; False otherwise.
 * @throws HiveException If an error occurs while checking file permissions.
 */
private boolean isPathReadOnly(Path path) throws HiveException {
    HiveConf conf = SessionState.get().getConf();
    try {
        FileSystem fs = path.getFileSystem(conf);
        UserGroupInformation ugi = Utils.getUGI();
        FileStatus status = fs.getFileStatus(path);
        // We just check for write permission. If the check fails with
        // AccessControlException, the location may be read-only.
        FileUtils.checkFileAccessWithImpersonation(fs, status, FsAction.WRITE, ugi.getUserName());
        // Path has write permission
        return false;
    } catch (AccessControlException e) {
        // Write access was denied, so we treat the path as read-only.
        return true;
    } catch (Exception e) {
        throw new HiveException("Unable to determine if " + path + " is read only: " + e, e);
    }
}
Also used: FileStatus (org.apache.hadoop.fs.FileStatus), HiveException (org.apache.hadoop.hive.ql.metadata.HiveException), FileSystem (org.apache.hadoop.fs.FileSystem), AccessControlException (java.security.AccessControlException), HiveConf (org.apache.hadoop.hive.conf.HiveConf), LockException (org.apache.hadoop.hive.ql.lockmgr.LockException), IOException (java.io.IOException), CalciteSemanticException (org.apache.hadoop.hive.ql.optimizer.calcite.CalciteSemanticException), MetaException (org.apache.hadoop.hive.metastore.api.MetaException), SerDeException (org.apache.hadoop.hive.serde2.SerDeException), PatternSyntaxException (java.util.regex.PatternSyntaxException), FileNotFoundException (java.io.FileNotFoundException), InvalidTableException (org.apache.hadoop.hive.ql.metadata.InvalidTableException), UserGroupInformation (org.apache.hadoop.security.UserGroupInformation)
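The same probe-for-write-access idea can be sketched against the local filesystem using only the JDK. This is an analogy, not Hive's implementation: here `Files.isWritable` plays the role that `FileUtils.checkFileAccessWithImpersonation` plays against HDFS, and a `false` result corresponds to the AccessControlException branch in the original.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReadOnlyProbe {
    // Local-filesystem analogue of isPathReadOnly: probe for write access and
    // report "read-only" when the probe says writing would be denied.
    static boolean isPathReadOnly(Path path) throws IOException {
        if (!Files.exists(path)) {
            throw new IOException("Unable to determine if " + path + " is read only");
        }
        // isWritable == false maps to the read-only case; true maps to the
        // "path has write permission" early return in the original.
        return !Files.isWritable(path);
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("probe", ".txt");
        try {
            // A freshly created temp file is writable by its owner.
            System.out.println(isPathReadOnly(tmp));  // false
        } finally {
            Files.deleteIfExists(tmp);
        }
    }
}
```

The original's design choice is worth noting: it probes by asking for WRITE access rather than reading permission bits, so it works uniformly across plain permissions, ACLs, and impersonated users.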

Example 75 with HiveException

Use of org.apache.hadoop.hive.ql.metadata.HiveException in project hive by apache.

In class SemanticAnalyzer, method getMaterializationMetadata:

public void getMaterializationMetadata(QB qb) throws SemanticException {
    try {
        gatherCTEReferences(qb, rootClause);
        int threshold = HiveConf.getIntVar(conf, HiveConf.ConfVars.HIVE_CTE_MATERIALIZE_THRESHOLD);
        for (CTEClause cte : Sets.newHashSet(aliasToCTEs.values())) {
            if (threshold >= 0 && cte.reference >= threshold) {
                cte.materialize = true;
            }
        }
    } catch (HiveException e) {
        // Has to use full name to make sure it does not conflict with
        // org.apache.commons.lang.StringUtils
        LOG.error(org.apache.hadoop.util.StringUtils.stringifyException(e));
        if (e instanceof SemanticException) {
            throw (SemanticException) e;
        }
        throw new SemanticException(e.getMessage(), e);
    }
}
Also used: HiveException (org.apache.hadoop.hive.ql.metadata.HiveException), SQLUniqueConstraint (org.apache.hadoop.hive.metastore.api.SQLUniqueConstraint), CheckConstraint (org.apache.hadoop.hive.ql.metadata.CheckConstraint), NotNullConstraint (org.apache.hadoop.hive.ql.metadata.NotNullConstraint), SQLCheckConstraint (org.apache.hadoop.hive.metastore.api.SQLCheckConstraint), SQLDefaultConstraint (org.apache.hadoop.hive.metastore.api.SQLDefaultConstraint), DefaultConstraint (org.apache.hadoop.hive.ql.metadata.DefaultConstraint), SQLNotNullConstraint (org.apache.hadoop.hive.metastore.api.SQLNotNullConstraint), CalciteSemanticException (org.apache.hadoop.hive.ql.optimizer.calcite.CalciteSemanticException)
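The catch block above illustrates a rethrow-or-wrap idiom: since SemanticException extends HiveException, a caught HiveException that is already a SemanticException passes through unchanged, while any other HiveException is wrapped, so callers only ever see SemanticException. A self-contained sketch with stand-in exception classes (simplified stand-ins for Hive's real hierarchy):

```java
// Stand-ins for Hive's exception hierarchy; in Hive,
// SemanticException does extend HiveException.
class HiveException extends Exception {
    HiveException(String message) { super(message); }
    HiveException(String message, Throwable cause) { super(message, cause); }
}

class SemanticException extends HiveException {
    SemanticException(String message) { super(message); }
    SemanticException(String message, Throwable cause) { super(message, cause); }
}

public class RethrowOrWrap {
    // Mirrors getMaterializationMetadata's catch block: preserve a
    // SemanticException as-is, wrap anything else.
    static SemanticException toSemantic(HiveException e) {
        if (e instanceof SemanticException) {
            return (SemanticException) e;
        }
        return new SemanticException(e.getMessage(), e);
    }

    public static void main(String[] args) {
        SemanticException sem = new SemanticException("bad query");
        HiveException plain = new HiveException("cte failure");
        System.out.println(toSemantic(sem) == sem);              // true: rethrown as-is
        System.out.println(toSemantic(plain).getCause() == plain); // true: wrapped
    }
}
```

Rethrowing the subtype directly, instead of wrapping it again, keeps stack traces flat and avoids nesting a SemanticException inside another SemanticException.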

Aggregations

HiveException (org.apache.hadoop.hive.ql.metadata.HiveException) 451
IOException (java.io.IOException) 172
ArrayList (java.util.ArrayList) 81
Path (org.apache.hadoop.fs.Path) 68
Table (org.apache.hadoop.hive.ql.metadata.Table) 65
SemanticException (org.apache.hadoop.hive.ql.parse.SemanticException) 46
SerDeException (org.apache.hadoop.hive.serde2.SerDeException) 45
ObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector) 45
MetaException (org.apache.hadoop.hive.metastore.api.MetaException) 42
Partition (org.apache.hadoop.hive.ql.metadata.Partition) 39
FileSystem (org.apache.hadoop.fs.FileSystem) 31
ExprNodeDesc (org.apache.hadoop.hive.ql.plan.ExprNodeDesc) 29
LinkedHashMap (java.util.LinkedHashMap) 28
FieldSchema (org.apache.hadoop.hive.metastore.api.FieldSchema) 28
InvalidTableException (org.apache.hadoop.hive.ql.metadata.InvalidTableException) 28
FileNotFoundException (java.io.FileNotFoundException) 27
URISyntaxException (java.net.URISyntaxException) 27
HashMap (java.util.HashMap) 26
InvalidOperationException (org.apache.hadoop.hive.metastore.api.InvalidOperationException) 23
StructObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector) 23