
Example 1 with AnalyticJob

Use of com.linkedin.drelephant.analysis.AnalyticJob in the dr-elephant project by LinkedIn.

From the class MapReduceFSFetcherHadoop2Test, method testGetHistoryDir:

@Test
public void testGetHistoryDir() {
    // document9 holds the fetcher configuration XML parsed earlier in the test class
    FetcherConfiguration fetcherConf = new FetcherConfiguration(document9.getDocumentElement());
    try {
        MapReduceFSFetcherHadoop2 fetcher = new MapReduceFSFetcherHadoop2(fetcherConf.getFetchersConfigurationData().get(0));
        // A job that finished on 2016-07-30 should map to the .../2016/07/30/000084/ history directory
        Calendar timestamp = Calendar.getInstance();
        timestamp.set(2016, Calendar.JULY, 30);
        AnalyticJob job = new AnalyticJob().setAppId("application_1461566847127_84624").setFinishTime(timestamp.getTimeInMillis());
        // Trailing "" yields the trailing separator that getHistoryDir() produces
        String expected = StringUtils.join(new String[] { fetcher.getHistoryLocation(), "2016", "07", "30", "000084", "" }, File.separator);
        Assert.assertEquals("Error history directory", expected, fetcher.getHistoryDir(job));
    } catch (IOException e) {
        // A non-null exception fails this assertion, surfacing the FileSystem init error
        Assert.assertNull("Failed to initialize FileSystem", e);
    }
}
Also used:
- FetcherConfiguration (com.linkedin.drelephant.configurations.fetcher.FetcherConfiguration)
- AnalyticJob (com.linkedin.drelephant.analysis.AnalyticJob)
- Calendar (java.util.Calendar)
- IOException (java.io.IOException)
- Test (org.junit.Test)
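The test above pins down the directory layout that getHistoryDir() is expected to produce: a date path (yyyy/MM/dd) derived from the job's finish time, followed by a serial bucket derived from the application id's numeric suffix divided by 1000 and zero-padded to six digits (84624 → 000084). The following is a minimal sketch of that derivation; the class and method names (HistoryDirSketch, historyDir) are illustrative, not Dr. Elephant's actual implementation.

```java
import java.io.File;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;

public class HistoryDirSketch {

    // Hypothetical helper mirroring what the test expects:
    // <historyLocation>/yyyy/MM/dd/<serial>/ where <serial> is the job's
    // numeric suffix divided by 1000, zero-padded to six digits.
    static String historyDir(String historyLocation, String appId, long finishTime) {
        // Date component from the job's finish time
        String datePart = new SimpleDateFormat("yyyy/MM/dd").format(new Date(finishTime));
        // Serial bucket from the trailing job number: 84624 / 1000 = 84 -> "000084"
        long jobSerial = Long.parseLong(appId.substring(appId.lastIndexOf('_') + 1));
        String serialPart = String.format("%06d", jobSerial / 1000);
        return historyLocation + File.separator
                + datePart.replace('/', File.separatorChar)
                + File.separator + serialPart + File.separator;
    }

    public static void main(String[] args) {
        Calendar ts = Calendar.getInstance();
        ts.clear();
        ts.set(2016, Calendar.JULY, 30);
        // On a Unix filesystem this prints: /history/2016/07/30/000084/
        System.out.println(historyDir("/history", "application_1461566847127_84624", ts.getTimeInMillis()));
    }
}
```

The trailing separator matters: the test builds its expected string by joining with a final empty element, so a sketch without it would not match.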

Example 2 with AnalyticJob

Use of com.linkedin.drelephant.analysis.AnalyticJob in the dr-elephant project by LinkedIn.

From the class ElephantRunner, method run:

@Override
public void run() {
    logger.info("Dr.elephant has started");
    try {
        _hadoopSecurity = HadoopSecurity.getInstance();
        _hadoopSecurity.doAs(new PrivilegedAction<Void>() {

            @Override
            public Void run() {
                HDFSContext.load();
                loadGeneralConfiguration();
                loadAnalyticJobGenerator();
                ElephantContext.init();
                // Initialize the metrics registries.
                MetricsController.init();
                logger.info("executor num is " + _executorNum);
                if (_executorNum < 1) {
                    throw new RuntimeException("Must have at least 1 worker thread.");
                }
                ThreadFactory factory = new ThreadFactoryBuilder().setNameFormat("dr-el-executor-thread-%d").build();
                _threadPoolExecutor = new ThreadPoolExecutor(_executorNum, _executorNum, 0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<Runnable>(), factory);
                while (_running.get() && !Thread.currentThread().isInterrupted()) {
                    _analyticJobGenerator.updateResourceManagerAddresses();
                    lastRun = System.currentTimeMillis();
                    logger.info("Fetching analytic job list...");
                    try {
                        _hadoopSecurity.checkLogin();
                    } catch (IOException e) {
                        logger.info("Error with hadoop kerberos login", e);
                        // Wait for a while before retry
                        waitInterval(_retryInterval);
                        continue;
                    }
                    List<AnalyticJob> todos;
                    try {
                        todos = _analyticJobGenerator.fetchAnalyticJobs();
                    } catch (Exception e) {
                        logger.error("Error fetching job list. Try again later...", e);
                        // Wait for a while before retry
                        waitInterval(_retryInterval);
                        continue;
                    }
                    for (AnalyticJob analyticJob : todos) {
                        _threadPoolExecutor.submit(new ExecutorJob(analyticJob));
                    }
                    int queueSize = _threadPoolExecutor.getQueue().size();
                    MetricsController.setQueueSize(queueSize);
                    logger.info("Job queue size is " + queueSize);
                    // Wait for a while before next fetch
                    waitInterval(_fetchInterval);
                }
                logger.info("Main thread is terminated.");
                return null;
            }
        });
    } catch (Exception e) {
        logger.error(e.getMessage());
        logger.error(ExceptionUtils.getStackTrace(e));
    }
}
Also used:
- IOException (java.io.IOException)
- SocketTimeoutException (java.net.SocketTimeoutException)
- AnalyticJob (com.linkedin.drelephant.analysis.AnalyticJob)
- ThreadFactoryBuilder (com.google.common.util.concurrent.ThreadFactoryBuilder)
- List (java.util.List)
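The core of ElephantRunner.run() is a poll-and-submit loop: fetch a batch of analytic jobs, hand each to a fixed-size thread pool, and back off for a retry interval on fetch failure instead of exiting. The standalone sketch below illustrates that pattern; the names (PollLoopSketch, fetchJobs, FETCH_INTERVAL_MS, MAX_CYCLES) and the bounded cycle count are assumptions made so the sketch terminates, not part of Dr. Elephant.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class PollLoopSketch {
    static final long FETCH_INTERVAL_MS = 50;
    static final int MAX_CYCLES = 3; // bounded here so the sketch terminates

    // Stand-in for _analyticJobGenerator.fetchAnalyticJobs()
    static List<String> fetchJobs(int cycle) {
        List<String> jobs = new ArrayList<>();
        jobs.add("application_" + cycle);
        return jobs;
    }

    public static void main(String[] args) throws InterruptedException {
        // Fixed-size pool with an unbounded work queue, as in the real runner
        ThreadPoolExecutor pool = new ThreadPoolExecutor(2, 2, 0L, TimeUnit.MILLISECONDS,
                new LinkedBlockingQueue<Runnable>());
        int submitted = 0;
        for (int cycle = 0; cycle < MAX_CYCLES; cycle++) {
            List<String> todos;
            try {
                todos = fetchJobs(cycle);
            } catch (RuntimeException e) {
                // On fetch failure: wait, then retry the loop rather than dying
                Thread.sleep(FETCH_INTERVAL_MS);
                continue;
            }
            for (String job : todos) {
                pool.submit(() -> { /* analyze one job here */ });
                submitted++;
            }
            // Wait before the next fetch, mirroring waitInterval(_fetchInterval)
            Thread.sleep(FETCH_INTERVAL_MS);
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        System.out.println("submitted " + submitted + " jobs"); // prints: submitted 3 jobs
    }
}
```

Submitting to an unbounded LinkedBlockingQueue is why the real runner reports the queue size to MetricsController each cycle: the loop never blocks on submission, so the queue length is the backpressure signal.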

Aggregations

- AnalyticJob (com.linkedin.drelephant.analysis.AnalyticJob): 2 usages
- IOException (java.io.IOException): 2 usages
- ThreadFactoryBuilder (com.google.common.util.concurrent.ThreadFactoryBuilder): 1 usage
- FetcherConfiguration (com.linkedin.drelephant.configurations.fetcher.FetcherConfiguration): 1 usage
- SocketTimeoutException (java.net.SocketTimeoutException): 1 usage
- Calendar (java.util.Calendar): 1 usage
- List (java.util.List): 1 usage
- Test (org.junit.Test): 1 usage