
Example 1 with Streamer

Use of io.crate.Streamer in project crate by crate.

From the class JobExecutionContextTest, the method testFailureClosesAllSubContexts:

@Test
public void testFailureClosesAllSubContexts() throws Exception {
    String localNodeId = "localNodeId";
    RoutedCollectPhase collectPhase = Mockito.mock(RoutedCollectPhase.class);
    Routing routing = Mockito.mock(Routing.class);
    when(routing.containsShards(localNodeId)).thenReturn(false);
    when(collectPhase.routing()).thenReturn(routing);
    when(collectPhase.maxRowGranularity()).thenReturn(RowGranularity.DOC);
    JobExecutionContext.Builder builder = new JobExecutionContext.Builder(UUID.randomUUID(), coordinatorNode, Collections.emptyList(), mock(JobsLogs.class));
    JobCollectContext jobCollectContext = new JobCollectContext(collectPhase, mock(MapSideDataCollectOperation.class), localNodeId, mock(RamAccountingContext.class), new TestingBatchConsumer(), mock(SharedShardContexts.class));
    TestingBatchConsumer batchConsumer = new TestingBatchConsumer();
    PageDownstreamContext pageDownstreamContext = spy(new PageDownstreamContext(Loggers.getLogger(PageDownstreamContext.class), "n1", 2, "dummy", batchConsumer, PassThroughPagingIterator.oneShot(), new Streamer[] { IntegerType.INSTANCE.streamer() }, mock(RamAccountingContext.class), 1));
    builder.addSubContext(jobCollectContext);
    builder.addSubContext(pageDownstreamContext);
    JobExecutionContext jobExecutionContext = builder.build();
    Exception failure = new Exception("failure!");
    jobCollectContext.close(failure);
    // other contexts must be killed with same failure
    verify(pageDownstreamContext, times(1)).innerKill(failure);
    final Field subContexts = JobExecutionContext.class.getDeclaredField("subContexts");
    subContexts.setAccessible(true);
    int size = ((ConcurrentMap<Integer, ExecutionSubContext>) subContexts.get(jobExecutionContext)).size();
    assertThat(size, is(0));
}
Also used: RamAccountingContext (io.crate.breaker.RamAccountingContext), MapSideDataCollectOperation (io.crate.operation.collect.MapSideDataCollectOperation), ConcurrentMap (java.util.concurrent.ConcurrentMap), Routing (io.crate.metadata.Routing), JobCollectContext (io.crate.operation.collect.JobCollectContext), Field (java.lang.reflect.Field), SharedShardContexts (io.crate.action.job.SharedShardContexts), Streamer (io.crate.Streamer), TestingBatchConsumer (io.crate.testing.TestingBatchConsumer), JobsLogs (io.crate.operation.collect.stats.JobsLogs), RoutedCollectPhase (io.crate.planner.node.dql.RoutedCollectPhase), Test (org.junit.Test), CrateUnitTest (io.crate.test.integration.CrateUnitTest)
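
A minimal sketch of how such a Streamer[] can be derived from column types rather than written out by hand. It assumes only that the DataType base class in io.crate.types exposes the streamer() factory behind the IntegerType.INSTANCE.streamer() call above; the helper class and method names are illustrative, not part of CrateDB:

import io.crate.Streamer;
import io.crate.types.DataType;

final class StreamerArrays {

    // Build the per-column Streamer[] a downstream context expects from the column types.
    static Streamer<?>[] fromTypes(DataType<?>... types) {
        Streamer<?>[] streamers = new Streamer<?>[types.length];
        for (int i = 0; i < types.length; i++) {
            // each DataType knows how to (de)serialize its own values
            streamers[i] = types[i].streamer();
        }
        return streamers;
    }
}

With that helper, the single-column array in the test above could be written as StreamerArrays.fromTypes(IntegerType.INSTANCE).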

Example 2 with Streamer

Use of io.crate.Streamer in project crate by crate.

From the class TDigestStateTest, the method testStreaming:

@Test
public void testStreaming() throws Exception {
    TDigestState digestState1 = new TDigestState(250, new double[] { 0.5, 0.8 });
    BytesStreamOutput out = new BytesStreamOutput();
    TDigestStateType digestStateType = TDigestStateType.INSTANCE;
    Streamer streamer = digestStateType.create().streamer();
    // write the digest through the streamer, then read it back from the same bytes
    streamer.writeValueTo(out, digestState1);
    StreamInput in = StreamInput.wrap(out.bytes());
    TDigestState digestState2 = (TDigestState) streamer.readValueFrom(in);
    assertEquals(digestState1.compression(), digestState2.compression(), 0.001d);
    assertEquals(digestState1.fractions()[0], digestState2.fractions()[0], 0.001d);
    assertEquals(digestState1.fractions()[1], digestState2.fractions()[1], 0.001d);
}
Also used: Streamer (io.crate.Streamer), StreamInput (org.elasticsearch.common.io.stream.StreamInput), BytesStreamOutput (org.elasticsearch.common.io.stream.BytesStreamOutput), Test (org.junit.Test)
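
The test above is the canonical Streamer round trip: write a value, wrap the written bytes, read it back. A minimal generic sketch of the same pattern, using only the calls that appear in the test; the helper itself is illustrative:

import io.crate.Streamer;
import org.elasticsearch.common.io.stream.BytesStreamOutput;
import org.elasticsearch.common.io.stream.StreamInput;
import java.io.IOException;

final class StreamerRoundTrip {

    @SuppressWarnings({ "unchecked", "rawtypes" })
    static <T> T roundTrip(Streamer streamer, T value) throws IOException {
        BytesStreamOutput out = new BytesStreamOutput();
        // serialize the value ...
        streamer.writeValueTo(out, value);
        // ... and read it back from the bytes that were just written
        StreamInput in = StreamInput.wrap(out.bytes());
        return (T) streamer.readValueFrom(in);
    }
}

testStreaming above is exactly roundTrip(streamer, digestState1) followed by the equality assertions.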

Example 3 with Streamer

Use of io.crate.Streamer in project crate by crate.

From the class DistributedResultRequestTest, the method testStreaming:

@Test
public void testStreaming() throws Exception {
    Streamer<?>[] streamers = new Streamer[] { DataTypes.STRING.streamer() };
    Object[][] rows = new Object[][] { { new BytesRef("ab") }, { null }, { new BytesRef("cd") } };
    UUID uuid = UUID.randomUUID();
    DistributedResultRequest r1 = new DistributedResultRequest(uuid, 1, (byte) 3, 1, streamers, new ArrayBucket(rows), false);
    BytesStreamOutput out = new BytesStreamOutput();
    r1.writeTo(out);
    StreamInput in = StreamInput.wrap(out.bytes());
    DistributedResultRequest r2 = new DistributedResultRequest();
    r2.readFrom(in);
    r2.streamers(streamers);
    assertTrue(r2.rowsCanBeRead());
    assertEquals(r1.rows().size(), r2.rows().size());
    assertThat(r1.isLast(), is(r2.isLast()));
    assertThat(r1.executionPhaseInputId(), is(r2.executionPhaseInputId()));
    assertThat(r2.rows(), contains(isRow("ab"), isNullRow(), isRow("cd")));
}
Also used: ArrayBucket (io.crate.data.ArrayBucket), Streamer (io.crate.Streamer), DistributedResultRequest (io.crate.executor.transport.distributed.DistributedResultRequest), StreamInput (org.elasticsearch.common.io.stream.StreamInput), UUID (java.util.UUID), BytesRef (org.apache.lucene.util.BytesRef), BytesStreamOutput (org.elasticsearch.common.io.stream.BytesStreamOutput), Test (org.junit.Test), CrateUnitTest (io.crate.test.integration.CrateUnitTest)
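
Note the order on the receiving side: readFrom only pulls the raw bytes off the wire, and the per-column streamers must be supplied before the rows can be decoded, which is what rowsCanBeRead() reports. A minimal sketch of that receive path, built only from the methods exercised in the test; the class and method names here are made up for illustration:

import io.crate.Streamer;
import io.crate.executor.transport.distributed.DistributedResultRequest;
import org.elasticsearch.common.io.stream.StreamInput;
import java.io.IOException;

final class ResultRequestReader {

    static DistributedResultRequest read(StreamInput in, Streamer<?>[] streamers) throws IOException {
        DistributedResultRequest request = new DistributedResultRequest();
        // pulls the serialized rows off the stream, but cannot decode them yet
        request.readFrom(in);
        // supply the per-column streamers so rows() can materialize the values
        request.streamers(streamers);
        assert request.rowsCanBeRead();
        return request;
    }
}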

Example 4 with Streamer

Use of io.crate.Streamer in project crate by crate.

From the class ShardUpsertRequest, the method writeTo:

@Override
public void writeTo(StreamOutput out) throws IOException {
    super.writeTo(out);
    // Stream References
    if (updateColumns != null) {
        out.writeVInt(updateColumns.length);
        for (String column : updateColumns) {
            out.writeString(column);
        }
    } else {
        out.writeVInt(0);
    }
    Streamer[] insertValuesStreamer = null;
    if (insertColumns != null) {
        out.writeVInt(insertColumns.length);
        for (Reference reference : insertColumns) {
            Reference.toStream(reference, out);
        }
        insertValuesStreamer = Symbols.streamerArray(List.of(insertColumns));
    } else {
        out.writeVInt(0);
    }
    out.writeBoolean(continueOnError);
    out.writeVInt(duplicateKeyAction.ordinal());
    out.writeBoolean(validateConstraints);
    sessionSettings.writeTo(out);
    out.writeVInt(items.size());
    for (Item item : items) {
        item.writeTo(out, insertValuesStreamer);
    }
    if (out.getVersion().onOrAfter(Version.V_4_2_0)) {
        if (returnValues != null) {
            out.writeVInt(returnValues.length);
            for (Symbol returnValue : returnValues) {
                Symbols.toStream(returnValue, out);
            }
        } else {
            out.writeVInt(0);
        }
    }
}
Also used: Streamer (io.crate.Streamer), Reference (io.crate.metadata.Reference), BytesReference (org.elasticsearch.common.bytes.BytesReference), Symbol (io.crate.expression.symbol.Symbol)
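
writeTo relies on a simple framing convention: a null array is written as a zero count, otherwise the count followed by the elements, so null and empty arrays are indistinguishable on the wire. A minimal sketch of that convention in isolation; the helper name is illustrative, and only the writeVInt and writeString calls from the method above are assumed:

import org.elasticsearch.common.io.stream.StreamOutput;
import java.io.IOException;

final class NullableArrays {

    static void writeNullableStrings(StreamOutput out, String[] values) throws IOException {
        if (values == null) {
            // null collapses to the same wire form as an empty array
            out.writeVInt(0);
            return;
        }
        out.writeVInt(values.length);
        for (String value : values) {
            out.writeString(value);
        }
    }
}

The updateColumns, insertColumns and returnValues branches above all follow this shape, each with its element-specific write call.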

Example 5 with Streamer

Use of io.crate.Streamer in project crate by crate.

From the class FetchProjection, the method generateStreamersGroupedByReaderAndNode:

@SuppressWarnings({ "rawtypes" })
public Map<String, ? extends IntObjectMap<Streamer[]>> generateStreamersGroupedByReaderAndNode() {
    HashMap<String, IntObjectHashMap<Streamer[]>> streamersByReaderByNode = new HashMap<>();
    for (Map.Entry<String, IntSet> entry : nodeReaders.entrySet()) {
        IntObjectHashMap<Streamer[]> streamersByReaderId = new IntObjectHashMap<>();
        String nodeId = entry.getKey();
        streamersByReaderByNode.put(nodeId, streamersByReaderId);
        for (IntCursor readerIdCursor : entry.getValue()) {
            int readerId = readerIdCursor.value;
            String index = readerIndices.floorEntry(readerId).getValue();
            RelationName relationName = indicesToIdents.get(index);
            FetchSource fetchSource = fetchSources.get(relationName);
            if (fetchSource == null) {
                continue;
            }
            streamersByReaderId.put(readerIdCursor.value, Symbols.streamerArray(fetchSource.references()));
        }
    }
    return streamersByReaderByNode;
}
Also used: FetchSource (io.crate.planner.node.fetch.FetchSource), HashMap (java.util.HashMap), IntObjectHashMap (com.carrotsearch.hppc.IntObjectHashMap), IntSet (com.carrotsearch.hppc.IntSet), Streamer (io.crate.Streamer), IntCursor (com.carrotsearch.hppc.cursors.IntCursor), RelationName (io.crate.metadata.RelationName), IntObjectMap (com.carrotsearch.hppc.IntObjectMap), TreeMap (java.util.TreeMap), Map (java.util.Map)
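
A consumer of this method walks the nested map node by node and reader by reader. A small traversal sketch: IntObjectCursor comes from the same hppc cursors package as the IntCursor imported above, and the loop body is illustrative only:

import com.carrotsearch.hppc.IntObjectMap;
import com.carrotsearch.hppc.cursors.IntObjectCursor;
import io.crate.Streamer;
import java.util.Map;

final class StreamerMapWalker {

    @SuppressWarnings({ "rawtypes" })
    static void walk(Map<String, ? extends IntObjectMap<Streamer[]>> streamersByReaderByNode) {
        for (Map.Entry<String, ? extends IntObjectMap<Streamer[]>> node : streamersByReaderByNode.entrySet()) {
            String nodeId = node.getKey();
            // each hppc map iterates as cursors of (readerId, Streamer[] for the fetched columns)
            for (IntObjectCursor<Streamer[]> reader : node.getValue()) {
                System.out.println(nodeId + " reader " + reader.key + ": " + reader.value.length + " column streamer(s)");
            }
        }
    }
}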

Aggregations

Streamer (io.crate.Streamer): 20 usages
Test (org.junit.Test): 13 usages
BytesStreamOutput (org.elasticsearch.common.io.stream.BytesStreamOutput): 6 usages
StreamInput (org.elasticsearch.common.io.stream.StreamInput): 6 usages
Row (io.crate.data.Row): 4 usages
DistResultRXTask (io.crate.execution.jobs.DistResultRXTask): 4 usages
CrateUnitTest (io.crate.test.integration.CrateUnitTest): 4 usages
TestingRowConsumer (io.crate.testing.TestingRowConsumer): 4 usages
ArrayList (java.util.ArrayList): 3 usages
Map (java.util.Map): 3 usages
UUID (java.util.UUID): 3 usages
IntObjectHashMap (com.carrotsearch.hppc.IntObjectHashMap): 2 usages
IntObjectMap (com.carrotsearch.hppc.IntObjectMap): 2 usages
IntCursor (com.carrotsearch.hppc.cursors.IntCursor): 2 usages
BlockBasedRamAccounting (io.crate.breaker.BlockBasedRamAccounting): 2 usages
PageDownstreamContext (io.crate.jobs.PageDownstreamContext): 2 usages
Reference (io.crate.metadata.Reference): 2 usages
RelationName (io.crate.metadata.RelationName): 2 usages
Routing (io.crate.metadata.Routing): 2 usages
Schemas (io.crate.metadata.Schemas): 2 usages