Search in sources:

Example 1 with PartitionsSpec

Use of io.druid.indexer.partitions.PartitionsSpec in project druid by druid-io.

From the class HadoopIngestionSpecTest, the method testPartitionsSpecMaxPartitionSize.

@Test
public void testPartitionsSpecMaxPartitionSize() {
    final HadoopIngestionSpec schema;
    try {
        schema = jsonReadWriteRead(
            "{\n"
            + "    \"tuningConfig\": {\n"
            + "        \"type\": \"hadoop\",\n"
            + "        \"partitionsSpec\": {\n"
            + "            \"type\": \"dimension\",\n"
            + "            \"targetPartitionSize\": 100,\n"
            + "            \"maxPartitionSize\": 200,\n"
            + "            \"partitionDimension\": \"foo\"\n"
            + "        }\n"
            + "    }\n"
            + "}",
            HadoopIngestionSpec.class
        );
    } catch (Exception e) {
        throw Throwables.propagate(e);
    }
    final PartitionsSpec partitionsSpec = schema.getTuningConfig().getPartitionsSpec();
    Assert.assertEquals("isDeterminingPartitions", partitionsSpec.isDeterminingPartitions(), true);
    Assert.assertEquals("getTargetPartitionSize", partitionsSpec.getTargetPartitionSize(), 100);
    Assert.assertEquals("getMaxPartitionSize", partitionsSpec.getMaxPartitionSize(), 200);
    Assert.assertTrue("partitionsSpec", partitionsSpec instanceof SingleDimensionPartitionsSpec);
    Assert.assertEquals("getPartitionDimension", ((SingleDimensionPartitionsSpec) partitionsSpec).getPartitionDimension(), "foo");
}
Also used : HashedPartitionsSpec(io.druid.indexer.partitions.HashedPartitionsSpec) SingleDimensionPartitionsSpec(io.druid.indexer.partitions.SingleDimensionPartitionsSpec) PartitionsSpec(io.druid.indexer.partitions.PartitionsSpec) JsonProcessingException(com.fasterxml.jackson.core.JsonProcessingException) Test(org.junit.Test)
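
The "type" field in the partitionsSpec JSON selects the concrete subtype during Jackson deserialization, which is why the cast to SingleDimensionPartitionsSpec succeeds above. A minimal standalone sketch of that round trip, assuming a plain Jackson ObjectMapper is enough to resolve the annotated subtypes (the test's jsonReadWriteRead helper is not shown on this page):

import com.fasterxml.jackson.databind.ObjectMapper;
import io.druid.indexer.partitions.PartitionsSpec;
import io.druid.indexer.partitions.SingleDimensionPartitionsSpec;

public class PartitionsSpecDeserializationSketch {
    public static void main(String[] args) throws Exception {
        // A plain mapper is an assumption; Druid code normally uses its own
        // pre-configured ObjectMapper.
        final ObjectMapper mapper = new ObjectMapper();
        final PartitionsSpec spec = mapper.readValue(
            "{\"type\": \"dimension\","
            + " \"targetPartitionSize\": 100,"
            + " \"maxPartitionSize\": 200,"
            + " \"partitionDimension\": \"foo\"}",
            PartitionsSpec.class
        );
        // The "type" name resolves to the annotated subtype.
        System.out.println(spec instanceof SingleDimensionPartitionsSpec); // expected: true
        System.out.println(spec.getTargetPartitionSize());                 // expected: 100
    }
}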

Example 2 with PartitionsSpec

Use of io.druid.indexer.partitions.PartitionsSpec in project druid by druid-io.

From the class HadoopIngestionSpecTest, the method testPartitionsSpecAutoHashed.

@Test
public void testPartitionsSpecAutoHashed() {
    final HadoopIngestionSpec schema;
    try {
        schema = jsonReadWriteRead(
            "{\n"
            + "    \"tuningConfig\": {\n"
            + "        \"type\": \"hadoop\",\n"
            + "        \"partitionsSpec\": {\n"
            + "            \"targetPartitionSize\": 100\n"
            + "        }\n"
            + "    }\n"
            + "}",
            HadoopIngestionSpec.class
        );
    } catch (Exception e) {
        throw Throwables.propagate(e);
    }
    final PartitionsSpec partitionsSpec = schema.getTuningConfig().getPartitionsSpec();
    Assert.assertEquals("isDeterminingPartitions", partitionsSpec.isDeterminingPartitions(), true);
    Assert.assertEquals("getTargetPartitionSize", partitionsSpec.getTargetPartitionSize(), 100);
    Assert.assertTrue("partitionSpec", partitionsSpec instanceof HashedPartitionsSpec);
}
Also used : HashedPartitionsSpec(io.druid.indexer.partitions.HashedPartitionsSpec) SingleDimensionPartitionsSpec(io.druid.indexer.partitions.SingleDimensionPartitionsSpec) PartitionsSpec(io.druid.indexer.partitions.PartitionsSpec) JsonProcessingException(com.fasterxml.jackson.core.JsonProcessingException) Test(org.junit.Test)
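
The test above implies that a partitionsSpec with no explicit type deserializes as a hashed spec, i.e. hashed partitioning is the default. A short sketch contrasting the typeless form with its explicit equivalent, under the same plain-ObjectMapper assumption as the sketch after Example 1; "hashed" as the default type name is inferred from the assertion above:

import com.fasterxml.jackson.databind.ObjectMapper;
import io.druid.indexer.partitions.HashedPartitionsSpec;
import io.druid.indexer.partitions.PartitionsSpec;

public class DefaultPartitionsSpecSketch {
    public static void main(String[] args) throws Exception {
        final ObjectMapper mapper = new ObjectMapper();
        // Typeless spec, as in the test above: falls back to the default subtype.
        final PartitionsSpec implicit = mapper.readValue(
            "{\"targetPartitionSize\": 100}", PartitionsSpec.class);
        // Assumed explicit equivalent of the default.
        final PartitionsSpec explicit = mapper.readValue(
            "{\"type\": \"hashed\", \"targetPartitionSize\": 100}", PartitionsSpec.class);
        System.out.println(implicit instanceof HashedPartitionsSpec);   // expected: true
        System.out.println(implicit.getClass() == explicit.getClass()); // expected: true
    }
}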

Aggregations

JsonProcessingException (com.fasterxml.jackson.core.JsonProcessingException): 2
HashedPartitionsSpec (io.druid.indexer.partitions.HashedPartitionsSpec): 2
PartitionsSpec (io.druid.indexer.partitions.PartitionsSpec): 2
SingleDimensionPartitionsSpec (io.druid.indexer.partitions.SingleDimensionPartitionsSpec): 2
Test (org.junit.Test): 2