
Example 91 with DimensionsSpec

Use of org.apache.druid.data.input.impl.DimensionsSpec in project druid by druid-io.

From the class MaterializedViewSupervisorTest, method testSuspendedDoesntRun:

@Test
public void testSuspendedDoesntRun() {
    MaterializedViewSupervisorSpec suspended = new MaterializedViewSupervisorSpec(
        "base",
        new DimensionsSpec(Collections.singletonList(new StringDimensionSchema("dim"))),
        new AggregatorFactory[] { new LongSumAggregatorFactory("m1", "m1") },
        HadoopTuningConfig.makeDefaultTuningConfig(),
        null, null, null, null, null,
        /* suspended */ true,
        objectMapper,
        taskMaster,
        taskStorage,
        metadataSupervisorManager,
        sqlSegmentsMetadataManager,
        indexerMetadataStorageCoordinator,
        new MaterializedViewTaskConfig(),
        EasyMock.createMock(AuthorizerMapper.class),
        EasyMock.createMock(ChatHandlerProvider.class),
        new SupervisorStateManagerConfig());
    MaterializedViewSupervisor supervisor = (MaterializedViewSupervisor) suspended.createSupervisor();
    // Mock IndexerSQLMetadataStorageCoordinator so that any call to retrieveDataSourceMetadata
    // fails the test. retrieveDataSourceMetadata is the first thing run() does when the
    // supervisor is not suspended, so it must never be invoked here.
    IndexerSQLMetadataStorageCoordinator mock = EasyMock.createMock(IndexerSQLMetadataStorageCoordinator.class);
    EasyMock.expect(mock.retrieveDataSourceMetadata(suspended.getDataSourceName())).andAnswer(() -> {
        Assert.fail();
        return null;
    }).anyTimes();
    EasyMock.replay(mock);
    supervisor.run();
}
Also used: IndexerSQLMetadataStorageCoordinator (org.apache.druid.metadata.IndexerSQLMetadataStorageCoordinator), SupervisorStateManagerConfig (org.apache.druid.indexing.overlord.supervisor.SupervisorStateManagerConfig), LongSumAggregatorFactory (org.apache.druid.query.aggregation.LongSumAggregatorFactory), DimensionsSpec (org.apache.druid.data.input.impl.DimensionsSpec), AuthorizerMapper (org.apache.druid.server.security.AuthorizerMapper), ChatHandlerProvider (org.apache.druid.segment.realtime.firehose.ChatHandlerProvider), StringDimensionSchema (org.apache.druid.data.input.impl.StringDimensionSchema), Test (org.junit.Test)
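
The core of this test is the EasyMock expectation that fails the test if the mocked method is ever invoked. The same pattern can be distilled outside the Druid-specific setup; in the sketch below, the interface MetadataReader and its method fetch are hypothetical stand-ins, not Druid APIs.

import org.easymock.EasyMock;
import org.junit.Assert;

public class NeverCalledPatternSketch {

    // Hypothetical collaborator, standing in for IndexerSQLMetadataStorageCoordinator.
    interface MetadataReader {
        Object fetch(String dataSource);
    }

    public static void main(String[] args) {
        MetadataReader mock = EasyMock.createMock(MetadataReader.class);
        // anyTimes() permits zero calls, so the expectation is satisfied when
        // fetch() is never invoked; any actual call trips Assert.fail().
        EasyMock.expect(mock.fetch("base")).andAnswer(() -> {
            Assert.fail("fetch() must not run while suspended");
            return null;
        }).anyTimes();
        EasyMock.replay(mock);
        // ... exercise the suspended component with the mock here ...
        EasyMock.verify(mock);
    }
}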

Example 92 with DimensionsSpec

Use of org.apache.druid.data.input.impl.DimensionsSpec in project druid by druid-io.

From the class MaterializedViewSupervisorSpecTest, method testMaterializedViewSupervisorSpecCreated:

@Test
public void testMaterializedViewSupervisorSpecCreated() {
    Exception ex = null;
    try {
        MaterializedViewSupervisorSpec spec = new MaterializedViewSupervisorSpec(
            "wikiticker",
            new DimensionsSpec(Lists.newArrayList(
                new StringDimensionSchema("isUnpatrolled"),
                new StringDimensionSchema("metroCode"),
                new StringDimensionSchema("namespace"),
                new StringDimensionSchema("page"),
                new StringDimensionSchema("regionIsoCode"),
                new StringDimensionSchema("regionName"),
                new StringDimensionSchema("user"))),
            new AggregatorFactory[] {
                new CountAggregatorFactory("count"),
                new LongSumAggregatorFactory("added", "added")
            },
            HadoopTuningConfig.makeDefaultTuningConfig(),
            null, null, null, null, null,
            /* suspended */ false,
            objectMapper,
            null, null, null, null, null,
            new MaterializedViewTaskConfig(),
            EasyMock.createMock(AuthorizerMapper.class),
            new NoopChatHandlerProvider(),
            new SupervisorStateManagerConfig());
        Supervisor supervisor = spec.createSupervisor();
        Assert.assertTrue(supervisor instanceof MaterializedViewSupervisor);
        SupervisorTaskAutoScaler autoscaler = spec.createAutoscaler(supervisor);
        Assert.assertNull(autoscaler);
        try {
            supervisor.computeLagStats();
        } catch (Exception e) {
            Assert.assertTrue(e instanceof UnsupportedOperationException);
        }
        try {
            int count = supervisor.getActiveTaskGroupsCount();
        } catch (Exception e) {
            Assert.assertTrue(e instanceof UnsupportedOperationException);
        }
        // noop Callable is declared but not exercised in this excerpt.
        Callable<Integer> noop = new Callable<Integer>() {
            @Override
            public Integer call() {
                return -1;
            }
        };
    } catch (Exception e) {
        ex = e;
    }
    Assert.assertNull(ex);
}
Also used: Supervisor (org.apache.druid.indexing.overlord.supervisor.Supervisor), NoopChatHandlerProvider (org.apache.druid.segment.realtime.firehose.NoopChatHandlerProvider), LongSumAggregatorFactory (org.apache.druid.query.aggregation.LongSumAggregatorFactory), ExpectedException (org.junit.rules.ExpectedException), IOException (java.io.IOException), Callable (java.util.concurrent.Callable), StringDimensionSchema (org.apache.druid.data.input.impl.StringDimensionSchema), SupervisorTaskAutoScaler (org.apache.druid.indexing.overlord.supervisor.autoscaler.SupervisorTaskAutoScaler), CountAggregatorFactory (org.apache.druid.query.aggregation.CountAggregatorFactory), SupervisorStateManagerConfig (org.apache.druid.indexing.overlord.supervisor.SupervisorStateManagerConfig), DimensionsSpec (org.apache.druid.data.input.impl.DimensionsSpec), AuthorizerMapper (org.apache.druid.server.security.AuthorizerMapper), Test (org.junit.Test)
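
One caveat in the test above: each try/catch passes silently when no exception is thrown at all. If the project is on JUnit 4.13 or later (an assumption; the imports shown only confirm org.junit.Test), Assert.assertThrows expresses the same checks more strictly. A minimal sketch, with spec built exactly as in the example:

@Test
public void testLagStatsAndTaskCountUnsupported() {
    Supervisor supervisor = spec.createSupervisor();
    // Fails if no exception, or an exception of a different type, is thrown.
    Assert.assertThrows(UnsupportedOperationException.class, supervisor::computeLagStats);
    Assert.assertThrows(UnsupportedOperationException.class, supervisor::getActiveTaskGroupsCount);
}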

Example 93 with DimensionsSpec

Use of org.apache.druid.data.input.impl.DimensionsSpec in project druid by druid-io.

From the class MaterializedViewSupervisorSpecTest, method testNullBaseDataSource:

@Test
public void testNullBaseDataSource() {
    expectedException.expect(CoreMatchers.instanceOf(IllegalArgumentException.class));
    expectedException.expectMessage("baseDataSource cannot be null or empty. Please provide a baseDataSource.");
    // noinspection ResultOfObjectAllocationIgnored (the constructor call itself throws the expected exception)
    new MaterializedViewSupervisorSpec(
        null,
        new DimensionsSpec(Lists.newArrayList(
            new StringDimensionSchema("isUnpatrolled"),
            new StringDimensionSchema("metroCode"),
            new StringDimensionSchema("namespace"),
            new StringDimensionSchema("page"),
            new StringDimensionSchema("regionIsoCode"),
            new StringDimensionSchema("regionName"),
            new StringDimensionSchema("user"))),
        new AggregatorFactory[] {
            new CountAggregatorFactory("count"),
            new LongSumAggregatorFactory("added", "added")
        },
        HadoopTuningConfig.makeDefaultTuningConfig(),
        null, null, null, null, null,
        /* suspended */ false,
        objectMapper,
        null, null, null, null, null,
        new MaterializedViewTaskConfig(),
        EasyMock.createMock(AuthorizerMapper.class),
        new NoopChatHandlerProvider(),
        new SupervisorStateManagerConfig());
}
Also used: CountAggregatorFactory (org.apache.druid.query.aggregation.CountAggregatorFactory), NoopChatHandlerProvider (org.apache.druid.segment.realtime.firehose.NoopChatHandlerProvider), SupervisorStateManagerConfig (org.apache.druid.indexing.overlord.supervisor.SupervisorStateManagerConfig), LongSumAggregatorFactory (org.apache.druid.query.aggregation.LongSumAggregatorFactory), DimensionsSpec (org.apache.druid.data.input.impl.DimensionsSpec), AuthorizerMapper (org.apache.druid.server.security.AuthorizerMapper), StringDimensionSchema (org.apache.druid.data.input.impl.StringDimensionSchema), Test (org.junit.Test)
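
The expected message suggests the constructor validates its first argument before doing anything else. A minimal sketch of that kind of guard, assuming the common Guava Preconditions idiom (the actual Druid implementation may differ):

import com.google.common.base.Preconditions;
import com.google.common.base.Strings;

public class SpecValidationSketch {
    private final String baseDataSource;

    public SpecValidationSketch(String baseDataSource) {
        // Reject null or empty before any other initialization.
        Preconditions.checkArgument(
            !Strings.isNullOrEmpty(baseDataSource),
            "baseDataSource cannot be null or empty. Please provide a baseDataSource."
        );
        this.baseDataSource = baseDataSource;
    }
}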

Example 94 with DimensionsSpec

Use of org.apache.druid.data.input.impl.DimensionsSpec in project druid by druid-io.

From the class TimestampsParquetReaderTest, method testParseInt96Timestamp:

@Test
public void testParseInt96Timestamp() throws IOException {
    // The source Parquet file comes from the Apache Spark SQL repository tests,
    // where it is known as impala_timestamp.parq. It has a single column, "ts",
    // which is an INT96 timestamp.
    final String file = "example/timestamps/int96_timestamp.parquet";
    InputRowSchema schema = new InputRowSchema(
        new TimestampSpec("ts", "auto", null),
        new DimensionsSpec(DimensionsSpec.getDefaultSchemas(ImmutableList.of())),
        ColumnsFilter.all());
    InputEntityReader reader = createReader(file, schema, JSONPathSpec.DEFAULT);
    List<InputRow> rows = readAllRows(reader);
    Assert.assertEquals("2001-01-01T01:01:01.000Z", rows.get(0).getTimestamp().toString());
    reader = createReader(file, schema, JSONPathSpec.DEFAULT);
    List<InputRowListPlusRawValues> sampled = sampleAllRows(reader);
    final String expectedJson = "{\n" + "  \"ts\" : 978310861000\n" + "}";
    Assert.assertEquals(expectedJson, DEFAULT_JSON_WRITER.writeValueAsString(sampled.get(0).getRawValues()));
}
Also used: InputRowListPlusRawValues (org.apache.druid.data.input.InputRowListPlusRawValues), TimestampSpec (org.apache.druid.data.input.impl.TimestampSpec), InputRow (org.apache.druid.data.input.InputRow), DimensionsSpec (org.apache.druid.data.input.impl.DimensionsSpec), InputRowSchema (org.apache.druid.data.input.InputRowSchema), InputEntityReader (org.apache.druid.data.input.InputEntityReader), Test (org.junit.Test)
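
The two assertions agree with each other: 978310861000 is simply the epoch-millisecond form of the parsed timestamp. A plain-JDK sanity check, no Druid classes required:

import java.time.Instant;

public class Int96EpochCheck {
    public static void main(String[] args) {
        // Prints 2001-01-01T01:01:01Z, matching rows.get(0).getTimestamp()
        // in the test above.
        System.out.println(Instant.ofEpochMilli(978310861000L));
    }
}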

Example 95 with DimensionsSpec

Use of org.apache.druid.data.input.impl.DimensionsSpec in project druid by druid-io.

From the class TimestampsParquetReaderTest, method testTimeMillisInInt64:

@Test
public void testTimeMillisInInt64() throws IOException {
    final String file = "example/timestamps/timemillis-in-i64.parquet";
    InputRowSchema schema = new InputRowSchema(
        new TimestampSpec("time", "auto", null),
        new DimensionsSpec(DimensionsSpec.getDefaultSchemas(ImmutableList.of())),
        ColumnsFilter.all());
    InputEntityReader reader = createReader(file, schema, JSONPathSpec.DEFAULT);
    List<InputRow> rows = readAllRows(reader);
    Assert.assertEquals("1970-01-01T00:00:00.010Z", rows.get(0).getTimestamp().toString());
    reader = createReader(file, schema, JSONPathSpec.DEFAULT);
    List<InputRowListPlusRawValues> sampled = sampleAllRows(reader);
    final String expectedJson = "{\n" + "  \"time\" : 10\n" + "}";
    Assert.assertEquals(expectedJson, DEFAULT_JSON_WRITER.writeValueAsString(sampled.get(0).getRawValues()));
}
Also used: InputRowListPlusRawValues (org.apache.druid.data.input.InputRowListPlusRawValues), TimestampSpec (org.apache.druid.data.input.impl.TimestampSpec), InputRow (org.apache.druid.data.input.InputRow), DimensionsSpec (org.apache.druid.data.input.impl.DimensionsSpec), InputRowSchema (org.apache.druid.data.input.InputRowSchema), InputEntityReader (org.apache.druid.data.input.InputEntityReader), Test (org.junit.Test)
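
Both reader tests build their DimensionsSpec from an empty list, which declares no dimensions up front. A small sketch contrasting that with the explicit string dimensions used in the supervisor examples; the dimension name "page" here is illustrative:

import com.google.common.collect.ImmutableList;
import org.apache.druid.data.input.impl.DimensionsSpec;
import org.apache.druid.data.input.impl.StringDimensionSchema;

public class DimensionsSpecSketch {
    public static void main(String[] args) {
        // Empty default schemas: no dimensions declared up front.
        DimensionsSpec schemaless =
            new DimensionsSpec(DimensionsSpec.getDefaultSchemas(ImmutableList.of()));
        // Explicit string dimension, as in the supervisor examples above.
        DimensionsSpec explicit =
            new DimensionsSpec(ImmutableList.of(new StringDimensionSchema("page")));
        System.out.println(schemaless.getDimensions().isEmpty()); // true
        System.out.println(explicit.getDimensionNames());         // [page]
    }
}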

Aggregations

DimensionsSpec (org.apache.druid.data.input.impl.DimensionsSpec): 169 uses
Test (org.junit.Test): 129 uses
TimestampSpec (org.apache.druid.data.input.impl.TimestampSpec): 114 uses
InputRow (org.apache.druid.data.input.InputRow): 52 uses
AggregatorFactory (org.apache.druid.query.aggregation.AggregatorFactory): 47 uses
LongSumAggregatorFactory (org.apache.druid.query.aggregation.LongSumAggregatorFactory): 47 uses
UniformGranularitySpec (org.apache.druid.segment.indexing.granularity.UniformGranularitySpec): 42 uses
DataSchema (org.apache.druid.segment.indexing.DataSchema): 39 uses
StringDimensionSchema (org.apache.druid.data.input.impl.StringDimensionSchema): 37 uses
CountAggregatorFactory (org.apache.druid.query.aggregation.CountAggregatorFactory): 37 uses
InputRowSchema (org.apache.druid.data.input.InputRowSchema): 36 uses
Map (java.util.Map): 32 uses
InitializedNullHandlingTest (org.apache.druid.testing.InitializedNullHandlingTest): 32 uses
InputEntityReader (org.apache.druid.data.input.InputEntityReader): 31 uses
ArrayList (java.util.ArrayList): 29 uses
CsvInputFormat (org.apache.druid.data.input.impl.CsvInputFormat): 25 uses
MapBasedInputRow (org.apache.druid.data.input.MapBasedInputRow): 24 uses
JSONPathSpec (org.apache.druid.java.util.common.parsers.JSONPathSpec): 24 uses
HashMap (java.util.HashMap): 23 uses
ImmutableMap (com.google.common.collect.ImmutableMap): 21 uses