
Example 1 with TaskQueue

Use of org.apache.druid.indexing.overlord.TaskQueue in project druid by druid-io.

From the class OverlordResource, method taskPost:

/**
 * Warning, magic: {@link org.apache.druid.client.indexing.HttpIndexingServiceClient#runTask} may call this method
 * remotely with {@link ClientTaskQuery} objects, but we deserialize {@link Task} objects. See the comment for {@link
 * ClientTaskQuery} for details.
 */
@POST
@Path("/task")
@Consumes(MediaType.APPLICATION_JSON)
@Produces(MediaType.APPLICATION_JSON)
public Response taskPost(final Task task, @Context final HttpServletRequest req) {
    final String dataSource = task.getDataSource();
    final ResourceAction resourceAction = new ResourceAction(new Resource(dataSource, ResourceType.DATASOURCE), Action.WRITE);
    Access authResult = AuthorizationUtils.authorizeResourceAction(req, resourceAction, authorizerMapper);
    if (!authResult.isAllowed()) {
        throw new ForbiddenException(authResult.getMessage());
    }
    return asLeaderWith(taskMaster.getTaskQueue(), new Function<TaskQueue, Response>() {

        @Override
        public Response apply(TaskQueue taskQueue) {
            try {
                taskQueue.add(task);
                return Response.ok(ImmutableMap.of("task", task.getId())).build();
            } catch (EntryExistsException e) {
                return Response.status(Response.Status.BAD_REQUEST).entity(ImmutableMap.of("error", StringUtils.format("Task[%s] already exists!", task.getId()))).build();
            }
        }
    });
}
Also used: Response (javax.ws.rs.core.Response), ForbiddenException (org.apache.druid.server.security.ForbiddenException), Resource (org.apache.druid.server.security.Resource), Access (org.apache.druid.server.security.Access), TaskQueue (org.apache.druid.indexing.overlord.TaskQueue), EntryExistsException (org.apache.druid.metadata.EntryExistsException), ResourceAction (org.apache.druid.server.security.ResourceAction), Path (javax.ws.rs.Path), POST (javax.ws.rs.POST), Consumes (javax.ws.rs.Consumes), Produces (javax.ws.rs.Produces)
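
Since com.google.common.base.Function has a single abstract method, the anonymous class above can also be written as a lambda. The following is a minimal, behaviour-preserving sketch of the same return statement, using only the classes already shown in this example (an illustration, not the code as it appears in the Druid source):

return asLeaderWith(taskMaster.getTaskQueue(), (TaskQueue taskQueue) -> {
    try {
        // Hand the task to the queue; the overlord persists it and manages its execution.
        taskQueue.add(task);
        return Response.ok(ImmutableMap.of("task", task.getId())).build();
    } catch (EntryExistsException e) {
        // A task with the same id has already been submitted.
        return Response.status(Response.Status.BAD_REQUEST)
                       .entity(ImmutableMap.of("error", StringUtils.format("Task[%s] already exists!", task.getId())))
                       .build();
    }
});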

Example 2 with TaskQueue

Use of org.apache.druid.indexing.overlord.TaskQueue in project druid by druid-io.

From the class OverlordResourceTest, method testShutdownAllTasksForNonExistingDataSource:

@Test
public void testShutdownAllTasksForNonExistingDataSource() {
    final TaskQueue taskQueue = EasyMock.createMock(TaskQueue.class);
    EasyMock.expect(taskMaster.isLeader()).andReturn(true).anyTimes();
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskStorageQueryAdapter.getActiveTaskInfo(EasyMock.anyString())).andReturn(Collections.emptyList());
    EasyMock.replay(taskRunner, taskMaster, taskStorageQueryAdapter, indexerMetadataStorageAdapter, req, workerTaskRunnerQueryAdapter);
    final Response response = overlordResource.shutdownTasksForDataSource("notExisting");
    Assert.assertEquals(Status.NOT_FOUND.getStatusCode(), response.getStatus());
}
Also used: Response (javax.ws.rs.core.Response), TaskQueue (org.apache.druid.indexing.overlord.TaskQueue), Test (org.junit.Test)
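
A natural follow-up for a test like this, sketched here as plain EasyMock usage rather than quoted from the original source, is to verify the replayed mocks so the test also proves that nothing was actually shut down for the unknown datasource:

EasyMock.verify(taskRunner, taskMaster, taskStorageQueryAdapter, indexerMetadataStorageAdapter, req, workerTaskRunnerQueryAdapter);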

Example 3 with TaskQueue

Use of org.apache.druid.indexing.overlord.TaskQueue in project druid by druid-io.

From the class OverlordResourceTest, method testShutdownTask:

@Test
public void testShutdownTask() {
    // This is disabled since OverlordResource.doShutdown is annotated with TaskResourceFilter
    // This should be fixed in https://github.com/apache/druid/issues/6685.
    // expectAuthorizationTokenCheck();
    TaskQueue mockQueue = EasyMock.createMock(TaskQueue.class);
    EasyMock.expect(taskMaster.isLeader()).andReturn(true).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes();
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(mockQueue)).anyTimes();
    mockQueue.shutdown("id_1", "Shutdown request from user");
    EasyMock.expectLastCall();
    EasyMock.replay(taskRunner, taskMaster, taskStorageQueryAdapter, indexerMetadataStorageAdapter, req, mockQueue, workerTaskRunnerQueryAdapter);
    // The response entity maps "task" to the task id, so the value type is String, not Integer.
    final Map<String, String> response = (Map<String, String>) overlordResource.doShutdown("id_1").getEntity();
    Assert.assertEquals("id_1", response.get("task"));
}
Also used: TaskQueue (org.apache.druid.indexing.overlord.TaskQueue), Map (java.util.Map), ImmutableMap (com.google.common.collect.ImmutableMap), Test (org.junit.Test)
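
The mockQueue.shutdown(...) call followed by EasyMock.expectLastCall() is the standard EasyMock idiom for void methods: in record state the call only registers an expectation. A small self-contained sketch of the same pattern with a hypothetical task id (illustration only, not project code):

// Record an expectation on a void method, then replay and verify.
TaskQueue queue = EasyMock.createMock(TaskQueue.class);
queue.shutdown("some-task-id", "some reason"); // record state: nothing is shut down yet
EasyMock.expectLastCall().once();              // expect exactly one such call
EasyMock.replay(queue);
// ... exercise code that is supposed to call queue.shutdown("some-task-id", "some reason") ...
EasyMock.verify(queue);                        // fails if the recorded call never happened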

Example 4 with TaskQueue

Use of org.apache.druid.indexing.overlord.TaskQueue in project druid by druid-io.

From the class KafkaSupervisorTest, method setupTest:

@Before
public void setupTest() {
    taskStorage = createMock(TaskStorage.class);
    taskMaster = createMock(TaskMaster.class);
    taskRunner = createMock(TaskRunner.class);
    indexerMetadataStorageCoordinator = createMock(IndexerMetadataStorageCoordinator.class);
    taskClient = createMock(KafkaIndexTaskClient.class);
    taskQueue = createMock(TaskQueue.class);
    topic = getTopic();
    rowIngestionMetersFactory = new TestUtils().getRowIngestionMetersFactory();
    serviceEmitter = new ExceptionCapturingServiceEmitter();
    EmittingLogger.registerEmitter(serviceEmitter);
    supervisorConfig = new SupervisorStateManagerConfig();
    ingestionSchema = EasyMock.createMock(KafkaSupervisorIngestionSpec.class);
}
Also used: IndexerMetadataStorageCoordinator (org.apache.druid.indexing.overlord.IndexerMetadataStorageCoordinator), TestUtils (org.apache.druid.indexing.common.TestUtils), TaskStorage (org.apache.druid.indexing.overlord.TaskStorage), SupervisorStateManagerConfig (org.apache.druid.indexing.overlord.supervisor.SupervisorStateManagerConfig), KafkaIndexTaskClient (org.apache.druid.indexing.kafka.KafkaIndexTaskClient), TaskQueue (org.apache.druid.indexing.overlord.TaskQueue), TaskMaster (org.apache.druid.indexing.overlord.TaskMaster), ExceptionCapturingServiceEmitter (org.apache.druid.server.metrics.ExceptionCapturingServiceEmitter), TaskRunner (org.apache.druid.indexing.overlord.TaskRunner), Before (org.junit.Before)
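
In the tests built on this setup, the mocked taskQueue is typically what the supervisor submits its index tasks to. A hedged sketch of the kind of expectations such a test records (the matchers and return value are assumptions based on TaskQueue.add returning a boolean and declaring the checked EntryExistsException, so the surrounding test method would declare throws Exception):

EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
EasyMock.expect(taskQueue.add(EasyMock.anyObject(Task.class))).andReturn(true).anyTimes();
EasyMock.replay(taskMaster, taskQueue);
// ... run the supervisor cycle under test, then verify the mocks ...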

Example 5 with TaskQueue

Use of org.apache.druid.indexing.overlord.TaskQueue in project druid by druid-io.

From the class MaterializedViewSupervisorTest, method setUp:

@Before
public void setUp() {
    TestDerbyConnector derbyConnector = derbyConnectorRule.getConnector();
    derbyConnector.createDataSourceTable();
    derbyConnector.createSegmentTable();
    taskStorage = EasyMock.createMock(TaskStorage.class);
    taskMaster = EasyMock.createMock(TaskMaster.class);
    indexerMetadataStorageCoordinator = new IndexerSQLMetadataStorageCoordinator(objectMapper, derbyConnectorRule.metadataTablesConfigSupplier().get(), derbyConnector);
    metadataSupervisorManager = EasyMock.createMock(MetadataSupervisorManager.class);
    sqlSegmentsMetadataManager = EasyMock.createMock(SqlSegmentsMetadataManager.class);
    taskQueue = EasyMock.createMock(TaskQueue.class);
    taskQueue.start();
    objectMapper.registerSubtypes(new NamedType(HashBasedNumberedShardSpec.class, "hashed"));
    spec = new MaterializedViewSupervisorSpec(
        "base",
        new DimensionsSpec(Collections.singletonList(new StringDimensionSchema("dim"))),
        new AggregatorFactory[] { new LongSumAggregatorFactory("m1", "m1") },
        HadoopTuningConfig.makeDefaultTuningConfig(),
        null,
        null,
        null,
        null,
        null,
        false,
        objectMapper,
        taskMaster,
        taskStorage,
        metadataSupervisorManager,
        sqlSegmentsMetadataManager,
        indexerMetadataStorageCoordinator,
        new MaterializedViewTaskConfig(),
        EasyMock.createMock(AuthorizerMapper.class),
        EasyMock.createMock(ChatHandlerProvider.class),
        new SupervisorStateManagerConfig()
    );
    derivativeDatasourceName = spec.getDataSourceName();
    supervisor = (MaterializedViewSupervisor) spec.createSupervisor();
}
Also used: IndexerSQLMetadataStorageCoordinator (org.apache.druid.metadata.IndexerSQLMetadataStorageCoordinator), HashBasedNumberedShardSpec (org.apache.druid.timeline.partition.HashBasedNumberedShardSpec), NamedType (com.fasterxml.jackson.databind.jsontype.NamedType), LongSumAggregatorFactory (org.apache.druid.query.aggregation.LongSumAggregatorFactory), TestDerbyConnector (org.apache.druid.metadata.TestDerbyConnector), AggregatorFactory (org.apache.druid.query.aggregation.AggregatorFactory), StringDimensionSchema (org.apache.druid.data.input.impl.StringDimensionSchema), TaskStorage (org.apache.druid.indexing.overlord.TaskStorage), SupervisorStateManagerConfig (org.apache.druid.indexing.overlord.supervisor.SupervisorStateManagerConfig), TaskQueue (org.apache.druid.indexing.overlord.TaskQueue), DimensionsSpec (org.apache.druid.data.input.impl.DimensionsSpec), MetadataSupervisorManager (org.apache.druid.metadata.MetadataSupervisorManager), TaskMaster (org.apache.druid.indexing.overlord.TaskMaster), SqlSegmentsMetadataManager (org.apache.druid.metadata.SqlSegmentsMetadataManager), Before (org.junit.Before)
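
One detail worth calling out in this setup: taskQueue.start() is invoked while the EasyMock mock is still in record state, so it merely registers an expectation rather than starting anything. A generic sketch of that record/replay/verify lifecycle (an illustration of EasyMock behaviour, not the original test body):

TaskQueue queue = EasyMock.createMock(TaskQueue.class);
queue.start();            // record state: registers an expectation, nothing actually starts
EasyMock.replay(queue);   // switch the mock to replay state
queue.start();            // this call now counts against the recorded expectation
EasyMock.verify(queue);   // passes, because start() was called exactly once after replay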

Aggregations

TaskQueue (org.apache.druid.indexing.overlord.TaskQueue) 9
TaskMaster (org.apache.druid.indexing.overlord.TaskMaster) 4
TaskStorage (org.apache.druid.indexing.overlord.TaskStorage) 4
SupervisorStateManagerConfig (org.apache.druid.indexing.overlord.supervisor.SupervisorStateManagerConfig) 4
Before (org.junit.Before) 4
TestUtils (org.apache.druid.indexing.common.TestUtils) 3
IndexerMetadataStorageCoordinator (org.apache.druid.indexing.overlord.IndexerMetadataStorageCoordinator) 3
TaskRunner (org.apache.druid.indexing.overlord.TaskRunner) 3
Test (org.junit.Test) 3
ImmutableMap (com.google.common.collect.ImmutableMap) 2
Map (java.util.Map) 2
Response (javax.ws.rs.core.Response) 2
SeekableStreamIndexTaskRunner (org.apache.druid.indexing.seekablestream.SeekableStreamIndexTaskRunner) 2
EntryExistsException (org.apache.druid.metadata.EntryExistsException) 2
ExceptionCapturingServiceEmitter (org.apache.druid.server.metrics.ExceptionCapturingServiceEmitter) 2
NamedType (com.fasterxml.jackson.databind.jsontype.NamedType) 1
Int2ObjectLinkedOpenHashMap (it.unimi.dsi.fastutil.ints.Int2ObjectLinkedOpenHashMap) 1
File (java.io.File) 1
HashMap (java.util.HashMap) 1
ConcurrentHashMap (java.util.concurrent.ConcurrentHashMap) 1