
Example 1 with ExitCodeException

use of org.apache.hadoop.util.Shell.ExitCodeException in project hadoop by apache.

the class ShellBasedUnixGroupsMapping method resolvePartialGroupNames.

/**
   * Attempt to partially resolve group names.
   *
   * @param userName the user's name
   * @param errMessage error message from the shell command
   * @param groupNames the incomplete list of group names
   * @return a list of resolved group names
   * @throws PartialGroupNameException if the resolution fails or times out
   */
private List<String> resolvePartialGroupNames(String userName, String errMessage, String groupNames) throws PartialGroupNameException {
    // The exception may indicate that some group names are not resolvable.
    // Tolerate unresolvable group names and return the resolvable ones,
    // similar to what the JNI-based implementation does.
    if (Shell.WINDOWS) {
        throw new PartialGroupNameException("Does not support partial group" + " name resolution on Windows. " + errMessage);
    }
    if (groupNames.isEmpty()) {
        throw new PartialGroupNameException("The user name '" + userName + "' is not found. " + errMessage);
    } else {
        LOG.warn("Some group names for '{}' are not resolvable. {}", userName, errMessage);
        // attempt to partially resolve group names
        ShellCommandExecutor partialResolver = createGroupIDExecutor(userName);
        try {
            partialResolver.execute();
            return parsePartialGroupNames(groupNames, partialResolver.getOutput());
        } catch (ExitCodeException ece) {
            // something is terribly wrong, so give up.
            throw new PartialGroupNameException("failed to get group id list for user '" + userName + "'", ece);
        } catch (IOException ioe) {
            String message = "Can't execute the shell command to " + "get the list of group id for user '" + userName + "'";
            if (partialResolver.isTimedOut()) {
                message += " because of the command taking longer than " + "the configured timeout: " + timeout + " seconds";
            }
            throw new PartialGroupNameException(message, ioe);
        }
    }
}
Also used : ShellCommandExecutor(org.apache.hadoop.util.Shell.ShellCommandExecutor) IOException(java.io.IOException) ExitCodeException(org.apache.hadoop.util.Shell.ExitCodeException)
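The execute/catch structure above distinguishes two failure modes: the command ran but exited non-zero (`ExitCodeException`) versus the command could not be run at all (`IOException`). The sketch below reproduces that pattern in plain Java without Hadoop; the `ExitCodeException` class here is a stand-in modeled on `Shell.ExitCodeException`, not the Hadoop class itself, and it assumes a POSIX `sh` is available.

```java
import java.io.IOException;

public class ExitCodeDemo {
    /** Stand-in for Shell.ExitCodeException: the command launched but exited non-zero. */
    static class ExitCodeException extends IOException {
        final int exitCode;
        ExitCodeException(int exitCode, String message) {
            super(message);
            this.exitCode = exitCode;
        }
    }

    /** Run a command; throw ExitCodeException on non-zero exit, IOException if it cannot start. */
    static void execute(String... command) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(command).start();
        int code = p.waitFor();
        if (code != 0) {
            throw new ExitCodeException(code, "command exited with status " + code);
        }
    }

    public static void main(String[] args) throws Exception {
        String outcome;
        try {
            execute("sh", "-c", "exit 3");     // launches fine, fails with status 3
            outcome = "ok";
        } catch (ExitCodeException ece) {
            outcome = "exit:" + ece.exitCode;  // the branch resolvePartialGroupNames handles
        } catch (IOException ioe) {
            outcome = "launch-failed";         // command could not be started at all
        }
        System.out.println(outcome);           // prints exit:3 on a POSIX shell
    }
}
```

The point of the two catch blocks is the same as in `resolvePartialGroupNames`: a non-zero exit can carry diagnostics worth inspecting, while a launch failure usually cannot.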

Example 2 with ExitCodeException

use of org.apache.hadoop.util.Shell.ExitCodeException in project hadoop by apache.

the class ShellBasedUnixGroupsMapping method getUnixGroups.

/**
   * Get the current user's group list from Unix by running the command
   * 'groups'. NOTE: for a non-existent user this returns an EMPTY list.
   *
   * @param user get groups for this user
   * @return the groups list that the <code>user</code> belongs to. The primary
   *         group is returned first.
   * @throws IOException if any error is encountered while running the command
   */
private List<String> getUnixGroups(String user) throws IOException {
    ShellCommandExecutor executor = createGroupExecutor(user);
    List<String> groups;
    try {
        executor.execute();
        groups = resolveFullGroupNames(executor.getOutput());
    } catch (ExitCodeException e) {
        try {
            groups = resolvePartialGroupNames(user, e.getMessage(), executor.getOutput());
        } catch (PartialGroupNameException pge) {
            LOG.warn("unable to return groups for user {}", user, pge);
            return EMPTY_GROUPS;
        }
    } catch (IOException ioe) {
        // similar to how partial resolution failures are handled above
        if (executor.isTimedOut()) {
            LOG.warn("Unable to return groups for user '{}' as shell group lookup " + "command '{}' ran longer than the configured timeout limit of " + "{} seconds.", user, Joiner.on(' ').join(executor.getExecString()), timeout);
            return EMPTY_GROUPS;
        } else {
            // If its not an executor timeout, we should let the caller handle it
            throw ioe;
        }
    }
    // remove duplicated primary group
    if (!Shell.WINDOWS) {
        for (int i = 1; i < groups.size(); i++) {
            if (groups.get(i).equals(groups.get(0))) {
                groups.remove(i);
                break;
            }
        }
    }
    return groups;
}
Also used : ShellCommandExecutor(org.apache.hadoop.util.Shell.ShellCommandExecutor) IOException(java.io.IOException) ExitCodeException(org.apache.hadoop.util.Shell.ExitCodeException)
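The final loop in `getUnixGroups` removes the first later duplicate of the primary group (index 0), since `groups` on Unix lists the primary group both first and among the supplementary groups. That loop can be isolated as a small pure function, sketched here outside Hadoop with an assumed helper name:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class GroupDedup {
    /**
     * Remove the first later occurrence of the primary group (index 0),
     * mirroring the de-duplication loop at the end of getUnixGroups.
     * The primary group stays first; only one duplicate is removed.
     */
    public static List<String> dedupPrimary(List<String> groups) {
        List<String> result = new ArrayList<>(groups);
        for (int i = 1; i < result.size(); i++) {
            if (result.get(i).equals(result.get(0))) {
                result.remove(i);
                break;
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // 'staff' appears both as the primary and as a supplementary group.
        System.out.println(dedupPrimary(Arrays.asList("staff", "wheel", "staff")));
        // prints [staff, wheel]
    }
}
```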

Example 3 with ExitCodeException

use of org.apache.hadoop.util.Shell.ExitCodeException in project hadoop by apache.

the class ProcessTree method isAlive.

/**
   * Is the process with PID pid still alive?
   * This method assumes that isAlive is called on a pid that was alive not
   * too long ago, and hence assumes no chance of pid-wrapping-around.
   * 
   * @param pid pid of the process to check.
   * @return true if process is alive.
   */
public static boolean isAlive(String pid) {
    ShellCommandExecutor shexec = null;
    try {
        String[] args = { "kill", "-0", pid };
        shexec = new ShellCommandExecutor(args);
        shexec.execute();
    } catch (ExitCodeException ee) {
        return false;
    } catch (IOException ioe) {
        LOG.warn("Error executing shell command " + shexec.toString(), ioe);
        return false;
    }
    return shexec.getExitCode() == 0;
}
Also used : ShellCommandExecutor(org.apache.hadoop.util.Shell.ShellCommandExecutor) IOException(java.io.IOException) ExitCodeException(org.apache.hadoop.util.Shell.ExitCodeException)
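`isAlive` relies on the POSIX convention that `kill -0 <pid>` sends no signal but still reports, via its exit status, whether the process exists and may be signaled. A minimal sketch of the same check with plain `ProcessBuilder` instead of `ShellCommandExecutor` (POSIX-only; assumes `kill` is on the PATH and the caller has permission to signal the target):

```java
import java.io.IOException;

public class ProcessLiveness {
    /**
     * Check whether a PID is alive by sending signal 0, as isAlive above
     * does via ShellCommandExecutor. Exit status 0 means the process
     * exists; non-zero means it does not (or cannot be signaled).
     */
    public static boolean isAlive(String pid) {
        try {
            Process p = new ProcessBuilder("kill", "-0", pid).start();
            return p.waitFor() == 0;
        } catch (IOException | InterruptedException e) {
            // Could not run the check at all; conservatively report not alive.
            return false;
        }
    }

    public static void main(String[] args) {
        // The current JVM's own PID must be alive.
        long self = ProcessHandle.current().pid();
        System.out.println(isAlive(Long.toString(self)));
    }
}
```

Like the Hadoop version, this inherits the caveat from the javadoc: PIDs wrap around, so a "alive" answer is only trustworthy for a PID observed recently.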

Example 4 with ExitCodeException

use of org.apache.hadoop.util.Shell.ExitCodeException in project hadoop by apache.

the class TestContainerLaunch method testInvalidSymlinkDiagnostics.

// test that diagnostics are generated on failure
@Test(timeout = 20000)
public void testInvalidSymlinkDiagnostics() throws IOException {
    File shellFile = null;
    File tempFile = null;
    String symLink = Shell.WINDOWS ? "test.cmd" : "test";
    File symLinkFile = null;
    try {
        shellFile = Shell.appendScriptExtension(tmpDir, "hello");
        tempFile = Shell.appendScriptExtension(tmpDir, "temp");
        String timeoutCommand = Shell.WINDOWS ? "@echo \"hello\"" : "echo \"hello\"";
        PrintWriter writer = new PrintWriter(new FileOutputStream(shellFile));
        FileUtil.setExecutable(shellFile, true);
        writer.println(timeoutCommand);
        writer.close();
        Map<Path, List<String>> resources = new HashMap<Path, List<String>>();
        // This is an invalid path and should trigger an exception because no such file exists.
        Path invalidPath = new Path(shellFile.getAbsolutePath() + "randomPath");
        resources.put(invalidPath, Arrays.asList(symLink));
        FileOutputStream fos = new FileOutputStream(tempFile);
        Map<String, String> env = new HashMap<String, String>();
        List<String> commands = new ArrayList<String>();
        if (Shell.WINDOWS) {
            commands.add("cmd");
            commands.add("/c");
            commands.add("\"" + symLink + "\"");
        } else {
            commands.add("/bin/sh ./\\\"" + symLink + "\\\"");
        }
        DefaultContainerExecutor defaultContainerExecutor = new DefaultContainerExecutor();
        defaultContainerExecutor.setConf(new YarnConfiguration());
        defaultContainerExecutor.writeLaunchEnv(fos, env, resources, commands, new Path(localLogDir.getAbsolutePath()), "user");
        fos.flush();
        fos.close();
        FileUtil.setExecutable(tempFile, true);
        Shell.ShellCommandExecutor shexc = new Shell.ShellCommandExecutor(new String[] { tempFile.getAbsolutePath() }, tmpDir);
        String diagnostics = null;
        try {
            shexc.execute();
            Assert.fail("Should catch exception");
        } catch (ExitCodeException e) {
            diagnostics = e.getMessage();
        }
        Assert.assertNotNull(diagnostics);
        Assert.assertTrue(shexc.getExitCode() != 0);
        symLinkFile = new File(tmpDir, symLink);
    } finally {
        // cleanup
        if (shellFile != null && shellFile.exists()) {
            shellFile.delete();
        }
        if (tempFile != null && tempFile.exists()) {
            tempFile.delete();
        }
        if (symLinkFile != null && symLinkFile.exists()) {
            symLinkFile.delete();
        }
    }
}
Also used : Path(org.apache.hadoop.fs.Path) HashMap(java.util.HashMap) ArrayList(java.util.ArrayList) ExitCodeException(org.apache.hadoop.util.Shell.ExitCodeException) DefaultContainerExecutor(org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor) Shell(org.apache.hadoop.util.Shell) YarnConfiguration(org.apache.hadoop.yarn.conf.YarnConfiguration) FileOutputStream(java.io.FileOutputStream) List(java.util.List) ArrayList(java.util.ArrayList) JarFile(java.util.jar.JarFile) File(java.io.File) PrintWriter(java.io.PrintWriter) BaseContainerManagerTest(org.apache.hadoop.yarn.server.nodemanager.containermanager.BaseContainerManagerTest) Test(org.junit.Test)

Example 5 with ExitCodeException

use of org.apache.hadoop.util.Shell.ExitCodeException in project hbase by apache.

the class ThriftServer method printUsageAndExit.

private static void printUsageAndExit(Options options, int exitCode) throws ExitCodeException {
    HelpFormatter formatter = new HelpFormatter();
    formatter.printHelp("Thrift", null, options, "To start the Thrift server run 'hbase-daemon.sh start thrift'\n" + "To shutdown the thrift server run 'hbase-daemon.sh stop " + "thrift' or send a kill signal to the thrift server pid", true);
    throw new ExitCodeException(exitCode, "");
}
Also used : HelpFormatter(org.apache.commons.cli.HelpFormatter) ExitCodeException(org.apache.hadoop.util.Shell.ExitCodeException)
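Example 5 reuses `ExitCodeException` for a different purpose: instead of calling `System.exit` deep inside a helper, the helper throws an exception carrying the desired exit status, and the top-level caller decides when to actually exit. A plain-Java sketch of that pattern follows; the class and method names are illustrative stand-ins, not HBase API:

```java
public class ExitViaException {
    /** Carries an exit status up to main, modeled on Shell.ExitCodeException. */
    static class ExitCodeException extends Exception {
        final int exitCode;
        ExitCodeException(int exitCode, String message) {
            super(message);
            this.exitCode = exitCode;
        }
    }

    /** Hypothetical helper: validates arguments and signals the exit status by throwing. */
    static void run(String[] args) throws ExitCodeException {
        if (args.length == 0) {
            // In printUsageAndExit this is where usage text is printed.
            throw new ExitCodeException(1, "missing arguments");
        }
    }

    public static void main(String[] args) {
        try {
            run(args);
        } catch (ExitCodeException e) {
            System.err.println(e.getMessage());
            System.exit(e.exitCode);  // the only place that actually exits
        }
    }
}
```

Keeping `System.exit` out of helpers makes them unit-testable: a test can catch the exception and inspect the code, which is impossible once the JVM has exited.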

Aggregations

ExitCodeException (org.apache.hadoop.util.Shell.ExitCodeException): 10
IOException (java.io.IOException): 6
ShellCommandExecutor (org.apache.hadoop.util.Shell.ShellCommandExecutor): 6
File (java.io.File): 3
FileOutputStream (java.io.FileOutputStream): 3
ArrayList (java.util.ArrayList): 3
HashMap (java.util.HashMap): 3
List (java.util.List): 3
JarFile (java.util.jar.JarFile): 3
Path (org.apache.hadoop.fs.Path): 3
Shell (org.apache.hadoop.util.Shell): 3
YarnConfiguration (org.apache.hadoop.yarn.conf.YarnConfiguration): 3
DefaultContainerExecutor (org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor): 3
BaseContainerManagerTest (org.apache.hadoop.yarn.server.nodemanager.containermanager.BaseContainerManagerTest): 3
Test (org.junit.Test): 3
PrintWriter (java.io.PrintWriter): 2
BufferedReader (java.io.BufferedReader): 1
FileNotFoundException (java.io.FileNotFoundException): 1
StringReader (java.io.StringReader): 1
HelpFormatter (org.apache.commons.cli.HelpFormatter): 1