Example 1 with Application

Use of com.microsoft.azure.hdinsight.sdk.rest.spark.Application in the project azure-tools-for-java by Microsoft.

From the class SparkJobHttpHandler, method handle:

@Override
public void handle(HttpExchange httpExchange) throws IOException {
    httpExchange.getResponseHeaders().add("Access-Control-Allow-Origin", "*");
    JobRequestDetails requestDetail = JobRequestDetails.getJobRequestDetail(httpExchange);
    try {
        String path = requestDetail.getRequestPath();
        if (path.equalsIgnoreCase("/applications/") && requestDetail.getAppId().equalsIgnoreCase("0")) {
            try {
                List<Application> applications = SparkRestUtil.getSparkApplications(requestDetail.getCluster());
                Optional<String> responseString = ObjectConvertUtils.convertObjectToJsonString(applications);
                JobUtils.setResponse(httpExchange, responseString.orElseThrow(IOException::new));
            } catch (HDIException e) {
                DefaultLoader.getUIHelper().logError("get applications list error", e);
            }
        } else if (path.contains("application_graph")) {
            ApplicationKey key = new ApplicationKey(requestDetail.getCluster(), requestDetail.getAppId());
            List<Job> jobs = JobViewCacheManager.getJob(key);
            App app = JobViewCacheManager.getYarnApp(key);
            List<JobStartEventLog> jobStartEventLogs = JobViewCacheManager.getJobStartEventLogs(key);
            YarnAppWithJobs yarnAppWithJobs = new YarnAppWithJobs(app, jobs, jobStartEventLogs);
            Optional<String> responseString = ObjectConvertUtils.convertObjectToJsonString(yarnAppWithJobs);
            JobUtils.setResponse(httpExchange, responseString.orElseThrow(IOException::new));
        } else if (path.contains("stages_summary")) {
            List<Stage> stages = JobViewCacheManager.getStages(new ApplicationKey(requestDetail.getCluster(), requestDetail.getAppId()));
            Optional<String> responseString = ObjectConvertUtils.convertObjectToJsonString(stages);
            JobUtils.setResponse(httpExchange, responseString.orElseThrow(IOException::new));
        } else if (path.contains("executors_summary")) {
            List<Executor> executors = JobViewCacheManager.getExecutors(new ApplicationKey(requestDetail.getCluster(), requestDetail.getAppId()));
            Optional<String> responseString = ObjectConvertUtils.convertObjectToJsonString(executors);
            JobUtils.setResponse(httpExchange, responseString.orElseThrow(IOException::new));
        } else if (path.contains("tasks_summary")) {
            List<Task> tasks = JobViewCacheManager.getTasks(new ApplicationKey(requestDetail.getCluster(), requestDetail.getAppId()));
            Optional<String> responseString = ObjectConvertUtils.convertObjectToJsonString(tasks);
            JobUtils.setResponse(httpExchange, responseString.orElseThrow(IOException::new));
        }
    } catch (ExecutionException e) {
        JobUtils.setResponse(httpExchange, e.getMessage(), 500);
    }
}
Also used : App(com.microsoft.azure.hdinsight.sdk.rest.yarn.rm.App) Task(com.microsoft.azure.hdinsight.sdk.rest.spark.task.Task) Optional(java.util.Optional) JobRequestDetails(com.microsoft.azure.hdinsight.spark.jobs.framework.JobRequestDetails) HDIException(com.microsoft.azure.hdinsight.sdk.common.HDIException) IOException(java.io.IOException) Stage(com.microsoft.azure.hdinsight.sdk.rest.spark.stage.Stage) List(java.util.List) ExecutionException(java.util.concurrent.ExecutionException) Application(com.microsoft.azure.hdinsight.sdk.rest.spark.Application) YarnAppWithJobs(com.microsoft.azure.hdinsight.sdk.rest.spark.YarnAppWithJobs)
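The handler above fans out on substrings of the request path. As a minimal, self-contained sketch (the helper name and return values are hypothetical, not part of the plugin), the routing decision can be isolated like this:

```java
public class SparkJobRouter {
    /**
     * Mirrors the path dispatch in SparkJobHttpHandler.handle: the special
     * appId "0" on "/applications/" lists all applications; otherwise a
     * path substring selects which cached summary would be serialized.
     */
    public static String route(String path, String appId) {
        if (path.equalsIgnoreCase("/applications/") && appId.equalsIgnoreCase("0")) {
            return "applications";
        } else if (path.contains("application_graph")) {
            return "application_graph";
        } else if (path.contains("stages_summary")) {
            return "stages_summary";
        } else if (path.contains("executors_summary")) {
            return "executors_summary";
        } else if (path.contains("tasks_summary")) {
            return "tasks_summary";
        }
        // the real handler silently ignores unmatched paths
        return "unhandled";
    }

    public static void main(String[] args) {
        System.out.println(route("/applications/", "0"));             // applications
        System.out.println(route("/app_1/application_graph", "app_1")); // application_graph
    }
}
```

Note that `contains` matching means any path embedding one of these substrings is routed to that branch, in the order the branches are tested.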

Example 2 with Application

Use of com.microsoft.azure.hdinsight.sdk.rest.spark.Application in the project azure-tools-for-java by Microsoft.

From the class ActionHttpHandler, method handle:

@Override
public void handle(HttpExchange httpExchange) throws IOException {
    httpExchange.getResponseHeaders().add("Access-Control-Allow-Origin", "*");
    JobRequestDetails requestDetail = JobRequestDetails.getJobRequestDetail(httpExchange);
    final String path = requestDetail.getRequestPath();
    final String clusterConnectString = requestDetail.getCluster().getConnectionUrl();
    if (path.contains("yarnui")) {
        JobUtils.openYarnUIHistory(clusterConnectString, requestDetail.getAppId());
    } else if (path.contains("sparkui")) {
        try {
            Application application = JobViewCacheManager.getSingleSparkApplication(new ApplicationKey(requestDetail.getCluster(), requestDetail.getAppId()));
            JobUtils.openSparkUIHistory(clusterConnectString, requestDetail.getAppId(), application.getLastAttemptId());
            JobUtils.setResponse(httpExchange, "open browser successfully");
        } catch (ExecutionException e) {
            JobUtils.setResponse(httpExchange, "open browser error", 500);
            DefaultLoader.getUIHelper().showError(e.getMessage(), "open browser error");
        }
    }
}
Also used : JobRequestDetails(com.microsoft.azure.hdinsight.spark.jobs.framework.JobRequestDetails) ExecutionException(java.util.concurrent.ExecutionException) Application(com.microsoft.azure.hdinsight.sdk.rest.spark.Application)
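JobUtils.openSparkUIHistory receives the cluster connect string, the app id, and the application's last attempt id. A hypothetical sketch of composing a history-server URL from those three pieces (the actual URL layout lives inside JobUtils; the path segments below are an assumption, not taken from the source):

```java
public class SparkUiUrlSketch {
    /**
     * Hypothetical URL layout for a Spark history-server page; only the
     * "combine base + appId + attemptId" shape is taken from the example above.
     */
    public static String historyUrl(String clusterConnectString, String appId, String attemptId) {
        // normalize a trailing slash so we never emit "//" in the path
        String base = clusterConnectString.endsWith("/")
                ? clusterConnectString.substring(0, clusterConnectString.length() - 1)
                : clusterConnectString;
        return String.format("%s/sparkhistory/history/%s/%s", base, appId, attemptId);
    }

    public static void main(String[] args) {
        System.out.println(historyUrl("https://mycluster.azurehdinsight.net/",
                "application_1474254416879_0001", "1"));
    }
}
```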

Example 3 with Application

Use of com.microsoft.azure.hdinsight.sdk.rest.spark.Application in the project azure-tools-for-java by Microsoft.

From the class SparkRestUtil, method getSparkApplications:

@NotNull
public static List<Application> getSparkApplications(@NotNull IClusterDetail clusterDetail) throws HDIException, IOException {
    HttpEntity entity = getSparkRestEntity(clusterDetail, "");
    Optional<List<Application>> apps = ObjectConvertUtils.convertEntityToList(entity, Application.class);
    // a Spark job has at least one attempt; keep only applications whose
    // first attempt carries a non-null attempt id
    return apps.orElse(RestUtil.getEmptyList(Application.class))
            .stream()
            .filter(app -> app.getAttempts().size() != 0 && app.getAttempts().get(0).getAttemptId() != null)
            .collect(Collectors.toList());
}
}
Also used : java.util(java.util) Application(com.microsoft.azure.hdinsight.sdk.rest.spark.Application) NotNull(com.microsoft.azuretools.azurecommons.helpers.NotNull) Task(com.microsoft.azure.hdinsight.sdk.rest.spark.task.Task) ObjectConvertUtils(com.microsoft.azure.hdinsight.sdk.rest.ObjectConvertUtils) HDInsightLoader(com.microsoft.azure.hdinsight.common.HDInsightLoader) JSONObject(org.json.JSONObject) Charset(java.nio.charset.Charset) ZipFile(java.util.zip.ZipFile) JobStartEventLog(com.microsoft.azure.hdinsight.sdk.rest.spark.event.JobStartEventLog) AttemptWithAppId(com.microsoft.azure.hdinsight.sdk.rest.AttemptWithAppId) ZipEntry(java.util.zip.ZipEntry) IClusterDetail(com.microsoft.azure.hdinsight.sdk.cluster.IClusterDetail) HDIException(com.microsoft.azure.hdinsight.sdk.common.HDIException) HttpEntity(org.apache.http.HttpEntity) Executor(com.microsoft.azure.hdinsight.sdk.rest.spark.executor.Executor) FileUtils(org.apache.commons.io.FileUtils) IOException(java.io.IOException) Collectors(java.util.stream.Collectors) File(java.io.File) ExecutionException(java.util.concurrent.ExecutionException) IOUtils(org.apache.commons.io.IOUtils) Job(com.microsoft.azure.hdinsight.sdk.rest.spark.job.Job) Stage(com.microsoft.azure.hdinsight.sdk.rest.spark.stage.Stage) RestUtil(com.microsoft.azure.hdinsight.sdk.rest.RestUtil) InputStream(java.io.InputStream)
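The stream pipeline above keeps only applications whose first attempt has a non-null attempt id. The same filtering pattern, with stand-in record types replacing the SDK's Application and attempt classes (names here are illustrative, not the SDK's), can be sketched as:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class AttemptFilterSketch {
    // Stand-ins for the SDK types: only the fields the filter touches.
    public record Attempt(String attemptId) {}
    public record App(String id, List<Attempt> attempts) {}

    /** Keep only apps with at least one attempt whose first attempt has an id. */
    public static List<App> withValidFirstAttempt(List<App> apps) {
        return apps.stream()
                .filter(a -> !a.attempts().isEmpty() && a.attempts().get(0).attemptId() != null)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<App> apps = Arrays.asList(
                new App("app_1", List.of(new Attempt("1"))),   // kept
                new App("app_2", List.of()),                   // no attempts: dropped
                new App("app_3", List.of(new Attempt(null)))); // null id: dropped
        System.out.println(withValidFirstAttempt(apps).size()); // 1
    }
}
```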

Example 4 with Application

Use of com.microsoft.azure.hdinsight.sdk.rest.spark.Application in the project azure-tools-for-java by Microsoft.

From the class SparkRestUtil, method getSparkEventLogs:

public static List<JobStartEventLog> getSparkEventLogs(@NotNull ApplicationKey key) throws HDIException, IOException {
    String url = String.format("%s/logs", key.getAppId());
    String eventLogsPath = String.format("%s/SparkEventLogs/%s/eventLogs.zip", HDInsightLoader.getHDInsightHelper().getPluginRootPath(), key.getAppId());
    File file = new File(eventLogsPath);
    HttpEntity entity = getSparkRestEntity(key.getClusterDetails(), url);
    InputStream inputStream = entity.getContent();
    FileUtils.copyInputStreamToFile(inputStream, file);
    IOUtils.closeQuietly(inputStream);
    ZipFile zipFile = new ZipFile(file);
    List<? extends ZipEntry> entities = Collections.list(zipFile.entries());
    // every application has at least one attempt in the event log, and each
    // zip entry name is expected to be in the format "{appId}_{attemptId}",
    // so the last attempt's id equals the number of entries
    String entityName = String.format("%s_%s", key.getAppId(), entities.size());
    ZipEntry lastEntity = zipFile.getEntry(entityName);
    if (lastEntity == null) {
        throw new HDIException(String.format("No Spark event log entity found for app: %s", key.getAppId()));
    }
    InputStream zipFileInputStream = zipFile.getInputStream(lastEntity);
    String entityContent = IOUtils.toString(zipFileInputStream, Charset.forName("utf-8"));
    String[] lines = entityContent.split("\n");
    List<JobStartEventLog> jobStartEvents = Arrays.stream(lines).filter(line -> {
        JSONObject jsonObject = new JSONObject(line);
        String eventName = jsonObject.getString("Event");
        return eventName.equalsIgnoreCase("SparkListenerJobStart");
    }).map(oneLine -> ObjectConvertUtils.convertToObjectQuietly(oneLine, JobStartEventLog.class)).filter(Objects::nonNull).collect(Collectors.toList());
    return jobStartEvents;
}
Also used : java.util(java.util) Application(com.microsoft.azure.hdinsight.sdk.rest.spark.Application) NotNull(com.microsoft.azuretools.azurecommons.helpers.NotNull) Task(com.microsoft.azure.hdinsight.sdk.rest.spark.task.Task) ObjectConvertUtils(com.microsoft.azure.hdinsight.sdk.rest.ObjectConvertUtils) HDInsightLoader(com.microsoft.azure.hdinsight.common.HDInsightLoader) JSONObject(org.json.JSONObject) Charset(java.nio.charset.Charset) ZipFile(java.util.zip.ZipFile) JobStartEventLog(com.microsoft.azure.hdinsight.sdk.rest.spark.event.JobStartEventLog) AttemptWithAppId(com.microsoft.azure.hdinsight.sdk.rest.AttemptWithAppId) ZipEntry(java.util.zip.ZipEntry) IClusterDetail(com.microsoft.azure.hdinsight.sdk.cluster.IClusterDetail) HDIException(com.microsoft.azure.hdinsight.sdk.common.HDIException) HttpEntity(org.apache.http.HttpEntity) Executor(com.microsoft.azure.hdinsight.sdk.rest.spark.executor.Executor) FileUtils(org.apache.commons.io.FileUtils) IOException(java.io.IOException) Collectors(java.util.stream.Collectors) File(java.io.File) ExecutionException(java.util.concurrent.ExecutionException) IOUtils(org.apache.commons.io.IOUtils) Job(com.microsoft.azure.hdinsight.sdk.rest.spark.job.Job) Stage(com.microsoft.azure.hdinsight.sdk.rest.spark.stage.Stage) RestUtil(com.microsoft.azure.hdinsight.sdk.rest.RestUtil) InputStream(java.io.InputStream)
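getSparkEventLogs keeps only the event-log lines whose "Event" field is SparkListenerJobStart. A simplified, self-contained sketch of that line filter, using a regex in place of the org.json parsing in the method above (class and method names here are hypothetical):

```java
import java.util.Arrays;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class EventLogFilterSketch {
    // Crude extraction of the "Event" field from one JSON line; the real code
    // uses org.json's JSONObject instead of a regex.
    private static final Pattern EVENT = Pattern.compile("\"Event\"\\s*:\\s*\"([^\"]+)\"");

    /** Return only the newline-delimited lines describing a job start. */
    public static List<String> jobStartLines(String eventLog) {
        return Arrays.stream(eventLog.split("\n"))
                .filter(line -> {
                    Matcher m = EVENT.matcher(line);
                    return m.find() && m.group(1).equalsIgnoreCase("SparkListenerJobStart");
                })
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        String log = "{\"Event\":\"SparkListenerApplicationStart\"}\n"
                   + "{\"Event\":\"SparkListenerJobStart\",\"Job ID\":0}\n"
                   + "{\"Event\":\"SparkListenerJobStart\",\"Job ID\":1}";
        System.out.println(jobStartLines(log).size()); // 2
    }
}
```

In the original, each surviving line is then deserialized into a JobStartEventLog via ObjectConvertUtils.convertToObjectQuietly, with null results filtered out.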

Aggregations

Application (com.microsoft.azure.hdinsight.sdk.rest.spark.Application)4 ExecutionException (java.util.concurrent.ExecutionException)4 HDIException (com.microsoft.azure.hdinsight.sdk.common.HDIException)3 Stage (com.microsoft.azure.hdinsight.sdk.rest.spark.stage.Stage)3 Task (com.microsoft.azure.hdinsight.sdk.rest.spark.task.Task)3 IOException (java.io.IOException)3 HDInsightLoader (com.microsoft.azure.hdinsight.common.HDInsightLoader)2 IClusterDetail (com.microsoft.azure.hdinsight.sdk.cluster.IClusterDetail)2 AttemptWithAppId (com.microsoft.azure.hdinsight.sdk.rest.AttemptWithAppId)2 ObjectConvertUtils (com.microsoft.azure.hdinsight.sdk.rest.ObjectConvertUtils)2 RestUtil (com.microsoft.azure.hdinsight.sdk.rest.RestUtil)2 JobStartEventLog (com.microsoft.azure.hdinsight.sdk.rest.spark.event.JobStartEventLog)2 Executor (com.microsoft.azure.hdinsight.sdk.rest.spark.executor.Executor)2 Job (com.microsoft.azure.hdinsight.sdk.rest.spark.job.Job)2 JobRequestDetails (com.microsoft.azure.hdinsight.spark.jobs.framework.JobRequestDetails)2 NotNull (com.microsoft.azuretools.azurecommons.helpers.NotNull)2 File (java.io.File)2 InputStream (java.io.InputStream)2 Charset (java.nio.charset.Charset)2 java.util (java.util)2