Example 6 with Retryable

Use of org.apache.hadoop.hive.ql.exec.util.Retryable in project hive by apache.

The class FileList, method writeWithRetry.

private synchronized void writeWithRetry(String entry) throws IOException {
    Retryable retryable = buildRetryable();
    try {
        retryable.executeCallable((Callable<Void>) () -> {
            if (this.abortOperation) {
                LOG.debug("Aborting write operation for entry {} to file {}.", entry, backingFile);
                return null;
            }
            try {
                if (backingFileWriter == null) {
                    backingFileWriter = initWriter();
                }
                backingFileWriter.writeBytes(getEntryWithNewline(entry));
                backingFileWriter.hflush();
                LOG.info("Writing entry {} to file list backed by {}", entry, backingFile);
            } catch (IOException e) {
                LOG.error("Writing entry {} to file list {} failed, attempting retry.", entry, backingFile, e);
                this.retryMode = true;
                close();
                throw e;
            }
            return null;
        });
    } catch (Exception e) {
        this.abortOperation = true;
        throw new IOException(ErrorMsg.REPL_RETRY_EXHAUSTED.format(e.getMessage()));
    }
}
Also used : Retryable(org.apache.hadoop.hive.ql.exec.util.Retryable) IOException(java.io.IOException) UncheckedIOException(java.io.UncheckedIOException) NoSuchElementException(java.util.NoSuchElementException)
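The buildRetryable() helper is not shown in this example. Judging from the builder calls in the later examples, it wraps a Callable in a bounded retry loop with backoff. A minimal self-contained sketch of such a loop (illustrative stand-in only, not Hive's actual Retryable implementation) might look like this:

```java
import java.util.concurrent.Callable;

// Illustrative stand-in for Hive's Retryable: run a Callable, retrying
// on failure with exponential backoff until attempts are exhausted.
public class RetrySketch {
    private final int maxAttempts;
    private final long initialDelayMs;

    public RetrySketch(int maxAttempts, long initialDelayMs) {
        this.maxAttempts = maxAttempts;
        this.initialDelayMs = initialDelayMs;
    }

    public <T> T executeCallable(Callable<T> task) throws Exception {
        long delay = initialDelayMs;
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return task.call();
            } catch (Exception e) {
                last = e;
                if (attempt < maxAttempts) {
                    Thread.sleep(delay);
                    delay *= 2; // exponential backoff between attempts
                }
            }
        }
        throw last; // retries exhausted, mirroring REPL_RETRY_EXHAUSTED above
    }

    public static void main(String[] args) throws Exception {
        int[] calls = {0};
        // Task succeeds on the third attempt.
        String result = new RetrySketch(5, 1).executeCallable(() -> {
            if (++calls[0] < 3) throw new java.io.IOException("transient");
            return "ok";
        });
        System.out.println(result + " after " + calls[0] + " attempts");
    }
}
```

Note how writeWithRetry above uses the same shape: the Callable rethrows the IOException so the wrapper retries, and only when retries are exhausted does the outer catch convert it to REPL_RETRY_EXHAUSTED.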

Example 7 with Retryable

Use of org.apache.hadoop.hive.ql.exec.util.Retryable in project hive by apache.

The class RangerRestClientImpl, method saveRangerPoliciesToFile.

@Override
public Path saveRangerPoliciesToFile(RangerExportPolicyList rangerExportPolicyList, Path stagingDirPath, String fileName, HiveConf conf) throws SemanticException {
    Gson gson = new GsonBuilder().create();
    String jsonRangerExportPolicyList = gson.toJson(rangerExportPolicyList);
    Retryable retryable = Retryable.builder().withHiveConf(conf).withRetryOnException(IOException.class).build();
    try {
        return retryable.executeCallable(() -> writeExportedRangerPoliciesToJsonFile(jsonRangerExportPolicyList, fileName, stagingDirPath, conf));
    } catch (Exception e) {
        throw new SemanticException(ErrorMsg.REPL_RETRY_EXHAUSTED.format(e.getMessage()), e);
    }
}
Also used : GsonBuilder(com.google.gson.GsonBuilder) Retryable(org.apache.hadoop.hive.ql.exec.util.Retryable) Gson(com.google.gson.Gson) IOException(java.io.IOException) URISyntaxException(java.net.URISyntaxException) SemanticException(org.apache.hadoop.hive.ql.parse.SemanticException) FileNotFoundException(java.io.FileNotFoundException)

Example 8 with Retryable

Use of org.apache.hadoop.hive.ql.exec.util.Retryable in project hive by apache.

The class RangerRestClientImpl, method deleteRangerPolicy.

@Override
public void deleteRangerPolicy(String policyName, String baseUrl, String rangerHiveServiceName, HiveConf hiveConf) throws Exception {
    String finalUrl = getRangerDeleteUrl(baseUrl, policyName, rangerHiveServiceName);
    LOG.debug("URL to delete policy on target Ranger: {}", finalUrl);
    Retryable retryable = Retryable.builder().withHiveConf(hiveConf).withFailOnException(RuntimeException.class).withRetryOnException(Exception.class).build();
    try {
        retryable.executeCallable(() -> {
            ClientResponse clientResp = null;
            WebResource.Builder builder = getRangerResourceBuilder(finalUrl, hiveConf);
            clientResp = builder.delete(ClientResponse.class);
            if (clientResp != null) {
                switch(clientResp.getStatus()) {
                    case HttpServletResponse.SC_NO_CONTENT:
                        LOG.debug("Ranger policy: {} deleted successfully", policyName);
                        break;
                    case HttpServletResponse.SC_NOT_FOUND:
                        LOG.debug("Ranger policy: {} not found.", policyName);
                        break;
                    case HttpServletResponse.SC_FORBIDDEN:
                        throw new RuntimeException(ErrorMsg.RANGER_AUTHORIZATION_FAILED.getMsg());
                    case HttpServletResponse.SC_UNAUTHORIZED:
                        throw new RuntimeException(ErrorMsg.RANGER_AUTHENTICATION_FAILED.getMsg());
                    default:
                        throw new SemanticException("Ranger policy deletion failed, Please refer target Ranger admin logs.");
                }
            }
            return null;
        });
    } catch (RuntimeException e) {
        throw e;
    } catch (Exception e) {
        throw new SemanticException(ErrorMsg.REPL_RETRY_EXHAUSTED.format(e.getMessage()), e);
    }
}
Also used : ClientResponse(com.sun.jersey.api.client.ClientResponse) Retryable(org.apache.hadoop.hive.ql.exec.util.Retryable) WebResource(com.sun.jersey.api.client.WebResource) URISyntaxException(java.net.URISyntaxException) SemanticException(org.apache.hadoop.hive.ql.parse.SemanticException) IOException(java.io.IOException) FileNotFoundException(java.io.FileNotFoundException)
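This example combines withFailOnException(RuntimeException.class) with withRetryOnException(Exception.class): authentication and authorization failures are thrown as RuntimeException so they abort immediately, while other exceptions are retried. A self-contained sketch of that fail-fast/retry split (again an illustrative stand-in, not Hive's Retryable):

```java
import java.util.concurrent.Callable;

// Illustrative sketch of fail-fast vs. retryable exception classes,
// mirroring withFailOnException(RuntimeException.class)
//          .withRetryOnException(Exception.class) above.
public class FailFastRetry {
    private final Class<? extends Exception> failOn;
    private final int maxAttempts;

    public FailFastRetry(Class<? extends Exception> failOn, int maxAttempts) {
        this.failOn = failOn;
        this.maxAttempts = maxAttempts;
    }

    public <T> T executeCallable(Callable<T> task) throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return task.call();
            } catch (Exception e) {
                if (failOn.isInstance(e)) {
                    throw e; // fail-fast: auth-style errors are not retried
                }
                last = e; // anything else is retried
            }
        }
        throw last;
    }

    public static void main(String[] args) {
        FailFastRetry r = new FailFastRetry(RuntimeException.class, 3);
        int[] attempts = {0};
        try {
            r.executeCallable(() -> {
                attempts[0]++;
                throw new RuntimeException("authorization failed");
            });
        } catch (Exception e) {
            // Aborted after a single attempt, not three.
            System.out.println(attempts[0] + " attempt(s): " + e.getMessage());
        }
    }
}
```

The outer catch blocks in deleteRangerPolicy follow the same split: the RuntimeException is rethrown as-is, and only genuinely retried-and-exhausted exceptions become REPL_RETRY_EXHAUSTED.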

Example 9 with Retryable

Use of org.apache.hadoop.hive.ql.exec.util.Retryable in project hive by apache.

The class Utils, method writeFile.

public static long writeFile(FileSystem fs, Path exportFilePath, InputStream is, HiveConf conf) throws SemanticException {
    Retryable retryable = Retryable.builder().withHiveConf(conf).withRetryOnException(IOException.class).build();
    try {
        return retryable.executeCallable(() -> {
            FSDataOutputStream fos = null;
            try {
                long bytesWritten;
                fos = fs.create(exportFilePath);
                byte[] buffer = new byte[DEF_BUF_SIZE];
                int bytesRead;
                while ((bytesRead = is.read(buffer)) != -1) {
                    fos.write(buffer, 0, bytesRead);
                }
                bytesWritten = fos.getPos();
                return bytesWritten;
            } finally {
                if (fos != null) {
                    fos.close();
                }
            }
        });
    } catch (Exception e) {
        throw new SemanticException(e);
    }
}
Also used : Retryable(org.apache.hadoop.hive.ql.exec.util.Retryable) IOException(java.io.IOException) FSDataOutputStream(org.apache.hadoop.fs.FSDataOutputStream) URISyntaxException(java.net.URISyntaxException) SemanticException(org.apache.hadoop.hive.ql.parse.SemanticException) FileNotFoundException(java.io.FileNotFoundException) HiveException(org.apache.hadoop.hive.ql.metadata.HiveException)
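The core of writeFile is a plain buffered copy loop. It can be exercised without a Hadoop cluster by swapping standard java.io streams for FileSystem/FSDataOutputStream; a self-contained sketch of the same loop, using try-with-resources in place of the explicit finally (stream types here are substitutions, not what the Hive code uses):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Self-contained version of writeFile's buffered copy loop.
public class CopySketch {
    static final int DEF_BUF_SIZE = 16 * 1024; // assumed buffer size

    public static long copy(InputStream is, OutputStream os) throws IOException {
        byte[] buffer = new byte[DEF_BUF_SIZE];
        long bytesWritten = 0;
        int bytesRead;
        while ((bytesRead = is.read(buffer)) != -1) {
            os.write(buffer, 0, bytesRead);
            bytesWritten += bytesRead;
        }
        return bytesWritten;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "hello, repl".getBytes();
        try (InputStream is = new ByteArrayInputStream(data);
             ByteArrayOutputStream os = new ByteArrayOutputStream()) {
            long n = copy(is, os);
            System.out.println(n); // 11 bytes copied
        }
    }
}
```

One difference worth noting: the Hive method reports fos.getPos() from the FSDataOutputStream rather than counting bytes itself; the local counter above is the stdlib-stream equivalent.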

Example 10 with Retryable

Use of org.apache.hadoop.hive.ql.exec.util.Retryable in project hive by apache.

The class FileOperations, method exportFilesAsList.

/**
 * Needs the root data directory to which the data is to be exported.
 * The export here is the list of files in the table/partition, written to the _files
 * file in the exportRootDataDir provided.
 */
void exportFilesAsList() throws SemanticException {
    if (dataPathList.isEmpty()) {
        return;
    }
    Retryable retryable = Retryable.builder().withHiveConf(hiveConf).withRetryOnException(IOException.class).build();
    try {
        retryable.executeCallable((Callable<Void>) () -> {
            try (BufferedWriter writer = writer()) {
                for (Path dataPath : dataPathList) {
                    writeFilesList(listFilesInDir(dataPath), writer, AcidUtils.getAcidSubDir(dataPath));
                }
            } catch (IOException e) {
                if (e instanceof FileNotFoundException) {
                    logger.error("exporting data files in dir : " + dataPathList + " to " + exportRootDataDir + " failed");
                    throw new FileNotFoundException(FILE_NOT_FOUND.format(e.getMessage()));
                }
                // in case of io error, reset the file system object
                FileSystem.closeAllForUGI(Utils.getUGI());
                dataFileSystem = dataPathList.get(0).getFileSystem(hiveConf);
                exportFileSystem = exportRootDataDir.getFileSystem(hiveConf);
                Path exportPath = new Path(exportRootDataDir, EximUtil.FILES_NAME);
                if (exportFileSystem.exists(exportPath)) {
                    exportFileSystem.delete(exportPath, true);
                }
                throw e;
            }
            return null;
        });
    } catch (Exception e) {
        throw new SemanticException(e);
    }
}
Also used : Path(org.apache.hadoop.fs.Path) Retryable(org.apache.hadoop.hive.ql.exec.util.Retryable) FileNotFoundException(java.io.FileNotFoundException) IOException(java.io.IOException) LoginException(javax.security.auth.login.LoginException) SemanticException(org.apache.hadoop.hive.ql.parse.SemanticException) BufferedWriter(java.io.BufferedWriter)
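This last example shows a cleanup-before-retry pattern: on IOException the callable resets the FileSystem objects and deletes the partially written _files output before rethrowing, so the next attempt starts from a clean slate. The shape of that pattern, reduced to a self-contained sketch (the retry helper and cleanup hook are illustrative, not Hive API):

```java
import java.util.concurrent.Callable;

// Illustrative sketch of the "clean up, then rethrow so the retry
// starts fresh" pattern used by exportFilesAsList.
public class CleanupRetry {
    public static <T> T retryWithCleanup(Callable<T> task, Runnable cleanup,
                                         int maxAttempts) throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return task.call();
            } catch (Exception e) {
                last = e;
                cleanup.run(); // e.g. delete the partially written _files output
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        StringBuilder partial = new StringBuilder(); // stands in for the output file
        int[] calls = {0};
        String result = retryWithCleanup(() -> {
            partial.append("entry1\n");
            if (++calls[0] < 2) throw new java.io.IOException("flaky fs");
            return partial.toString();
        }, () -> partial.setLength(0), // discard partial output before retrying
           3);
        System.out.print(result); // a single clean "entry1" line, no duplicates
    }
}
```

Without the cleanup step, each retry would append to the leftovers of the failed attempt; deleting the partial exportPath in the catch block above serves exactly this purpose.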

Aggregations

IOException (java.io.IOException): 19
Retryable (org.apache.hadoop.hive.ql.exec.util.Retryable): 19
SemanticException (org.apache.hadoop.hive.ql.parse.SemanticException): 13
FileNotFoundException (java.io.FileNotFoundException): 11
Path (org.apache.hadoop.fs.Path): 8
FileSystem (org.apache.hadoop.fs.FileSystem): 6
UncheckedIOException (java.io.UncheckedIOException): 5
URISyntaxException (java.net.URISyntaxException): 5
ArrayList (java.util.ArrayList): 4
SnapshotException (org.apache.hadoop.hdfs.protocol.SnapshotException): 4
BufferedReader (java.io.BufferedReader): 3
InputStreamReader (java.io.InputStreamReader): 3
Task (org.apache.hadoop.hive.ql.exec.Task): 3
HiveException (org.apache.hadoop.hive.ql.metadata.HiveException): 3
Gson (com.google.gson.Gson): 2
GsonBuilder (com.google.gson.GsonBuilder): 2
MalformedURLException (java.net.MalformedURLException): 2
NoSuchElementException (java.util.NoSuchElementException): 2
AtomicBoolean (java.util.concurrent.atomic.AtomicBoolean): 2
FSDataOutputStream (org.apache.hadoop.fs.FSDataOutputStream): 2