
Example 1 with IAnswer

Use of org.easymock.IAnswer in project kafka by apache.

From the class KafkaStatusBackingStoreTest, method putConnectorStateRetriableFailure.

@Test
public void putConnectorStateRetriableFailure() {
    KafkaBasedLog<String, byte[]> kafkaBasedLog = mock(KafkaBasedLog.class);
    Converter converter = mock(Converter.class);
    KafkaStatusBackingStore store = new KafkaStatusBackingStore(new MockTime(), converter, STATUS_TOPIC, kafkaBasedLog);
    byte[] value = new byte[0];
    expect(converter.fromConnectData(eq(STATUS_TOPIC), anyObject(Schema.class), anyObject(Struct.class))).andStubReturn(value);
    final Capture<Callback> callbackCapture = newCapture();
    kafkaBasedLog.send(eq("status-connector-conn"), eq(value), capture(callbackCapture));
    expectLastCall().andAnswer(new IAnswer<Void>() {

        @Override
        public Void answer() throws Throwable {
            callbackCapture.getValue().onCompletion(null, new TimeoutException());
            return null;
        }
    }).andAnswer(new IAnswer<Void>() {

        @Override
        public Void answer() throws Throwable {
            callbackCapture.getValue().onCompletion(null, null);
            return null;
        }
    });
    replayAll();
    ConnectorStatus status = new ConnectorStatus(CONNECTOR, ConnectorStatus.State.RUNNING, WORKER_ID, 0);
    store.put(status);
    // state is not visible until read back from the log
    assertEquals(null, store.get(CONNECTOR));
    verifyAll();
}
Also used : Schema(org.apache.kafka.connect.data.Schema) ConnectorStatus(org.apache.kafka.connect.runtime.ConnectorStatus) Struct(org.apache.kafka.connect.data.Struct) IAnswer(org.easymock.IAnswer) Callback(org.apache.kafka.clients.producer.Callback) MockTime(org.apache.kafka.common.utils.MockTime) TimeoutException(org.apache.kafka.common.errors.TimeoutException) Test(org.junit.Test)
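
Because IAnswer declares a single answer() method, the two chained answers above can also be written as lambdas on Java 8 or later. A compact sketch of the same expectation, shown as a fragment that reuses callbackCapture and value from the test above (not taken verbatim from the Kafka sources):

kafkaBasedLog.send(eq("status-connector-conn"), eq(value), capture(callbackCapture));
expectLastCall()
    // first attempt: complete the captured callback with a retriable TimeoutException
    .andAnswer(() -> {
        callbackCapture.getValue().onCompletion(null, new TimeoutException());
        return null;
    })
    // retried attempt: complete the callback successfully
    .andAnswer(() -> {
        callbackCapture.getValue().onCompletion(null, null);
        return null;
    });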

Example 2 with IAnswer

Use of org.easymock.IAnswer in project kafka by apache.

From the class WorkerSourceTaskTest, method expectSendRecord.

private Capture<ProducerRecord<byte[], byte[]>> expectSendRecord(boolean anyTimes, boolean isRetry, boolean succeed) throws InterruptedException {
    expectConvertKeyValue(anyTimes);
    expectApplyTransformationChain(anyTimes);
    Capture<ProducerRecord<byte[], byte[]>> sent = EasyMock.newCapture();
    // 1. Offset data is passed to the offset storage.
    if (!isRetry) {
        offsetWriter.offset(PARTITION, OFFSET);
        if (anyTimes)
            PowerMock.expectLastCall().anyTimes();
        else
            PowerMock.expectLastCall();
    }
    // 2. Converted data passed to the producer, which will need callbacks invoked for flush to work
    IExpectationSetters<Future<RecordMetadata>> expect = EasyMock.expect(producer.send(EasyMock.capture(sent), EasyMock.capture(producerCallbacks)));
    IAnswer<Future<RecordMetadata>> expectResponse = new IAnswer<Future<RecordMetadata>>() {

        @Override
        public Future<RecordMetadata> answer() throws Throwable {
            synchronized (producerCallbacks) {
                for (org.apache.kafka.clients.producer.Callback cb : producerCallbacks.getValues()) {
                    cb.onCompletion(new RecordMetadata(new TopicPartition("foo", 0), 0, 0, 0L, 0L, 0, 0), null);
                }
                producerCallbacks.reset();
            }
            return sendFuture;
        }
    };
    if (anyTimes)
        expect.andStubAnswer(expectResponse);
    else
        expect.andAnswer(expectResponse);
    // 3. As a result of a successful producer send callback, we'll notify the source task of the record commit
    expectTaskCommitRecord(anyTimes, succeed);
    return sent;
}
Also used : RecordMetadata(org.apache.kafka.clients.producer.RecordMetadata) IAnswer(org.easymock.IAnswer) TopicPartition(org.apache.kafka.common.TopicPartition) ProducerRecord(org.apache.kafka.clients.producer.ProducerRecord) Future(java.util.concurrent.Future)
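
The producer expectation above pairs Capture with IAnswer: the arguments are recorded when the stubbed send() is matched, and the answer then reads the captured callbacks and completes them, so the canned behaviour depends on what the code under test actually passed in. A stripped-down, self-contained sketch of that pattern; KeyValueStore and its values are hypothetical names invented for illustration, only the EasyMock and JUnit calls are real:

import static org.easymock.EasyMock.*;

import org.easymock.Capture;
import org.easymock.IAnswer;
import org.junit.Assert;
import org.junit.Test;

public class CaptureDrivenAnswerSketchTest {

    // Hypothetical collaborator, not part of the Kafka code above.
    interface KeyValueStore {
        String get(String key);
    }

    @Test
    public void answerSeesTheCapturedArgument() {
        final KeyValueStore store = createMock(KeyValueStore.class);
        final Capture<String> key = newCapture();

        // The capture is populated before answer() runs, so the stubbed return value
        // can be derived from the actual argument, much like expectSendRecord derives
        // its behaviour from the captured producer callbacks.
        expect(store.get(capture(key))).andStubAnswer(new IAnswer<String>() {
            @Override
            public String answer() throws Throwable {
                return "value-for-" + key.getValue();
            }
        });
        replay(store);

        Assert.assertEquals("value-for-a", store.get("a"));
        Assert.assertEquals("value-for-b", store.get("b"));
        verify(store);
    }
}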

Example 3 with IAnswer

Use of org.easymock.IAnswer in project druid by druid-io.

From the class CachingClusteredClientTest, method testQueryCachingWithFilter.

@SuppressWarnings("unchecked")
public void testQueryCachingWithFilter(
        final QueryRunner runner,
        final int numTimesToQuery,
        final Query query,
        final List<Iterable<Result<TimeseriesResultValue>>> filteredExpected,
        // does this assume query intervals must be ordered?
        Object... args) {
    final List<Interval> queryIntervals = Lists.newArrayListWithCapacity(args.length / 2);
    final List<List<Iterable<Result<Object>>>> expectedResults = Lists.newArrayListWithCapacity(queryIntervals.size());
    parseResults(queryIntervals, expectedResults, args);
    for (int i = 0; i < queryIntervals.size(); ++i) {
        List<Object> mocks = Lists.newArrayList();
        mocks.add(serverView);
        final Interval actualQueryInterval = new Interval(queryIntervals.get(0).getStart(), queryIntervals.get(i).getEnd());
        final List<Map<DruidServer, ServerExpectations>> serverExpectationList = populateTimeline(queryIntervals, expectedResults, i, mocks);
        final Map<DruidServer, ServerExpectations> finalExpectation = serverExpectationList.get(serverExpectationList.size() - 1);
        for (Map.Entry<DruidServer, ServerExpectations> entry : finalExpectation.entrySet()) {
            DruidServer server = entry.getKey();
            ServerExpectations expectations = entry.getValue();
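            // times(0, 1): each server is queried at most once, and may not be queried at all if its results come from cache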
            EasyMock.expect(serverView.getQueryRunner(server)).andReturn(expectations.getQueryRunner()).times(0, 1);
            final Capture<? extends Query> capture = new Capture();
            final Capture<? extends Map> context = new Capture();
            QueryRunner queryable = expectations.getQueryRunner();
            if (query instanceof TimeseriesQuery) {
                final List<String> segmentIds = Lists.newArrayList();
                final List<Iterable<Result<TimeseriesResultValue>>> results = Lists.newArrayList();
                for (ServerExpectation expectation : expectations) {
                    segmentIds.add(expectation.getSegmentId());
                    results.add(expectation.getResults());
                }
                EasyMock.expect(queryable.run(EasyMock.capture(capture), EasyMock.capture(context))).andAnswer(new IAnswer<Sequence>() {

                    @Override
                    public Sequence answer() throws Throwable {
                        return toFilteredQueryableTimeseriesResults((TimeseriesQuery) capture.getValue(), segmentIds, queryIntervals, results);
                    }
                }).times(0, 1);
            } else {
                throw new ISE("Unknown query type[%s]", query.getClass());
            }
        }
        final Iterable<Result<Object>> expected = new ArrayList<>();
        for (int intervalNo = 0; intervalNo < i + 1; intervalNo++) {
            Iterables.addAll((List) expected, filteredExpected.get(intervalNo));
        }
        runWithMocks(new Runnable() {

            @Override
            public void run() {
                HashMap<String, List> context = new HashMap<String, List>();
                for (int i = 0; i < numTimesToQuery; ++i) {
                    TestHelper.assertExpectedResults(expected, runner.run(query.withQuerySegmentSpec(new MultipleIntervalSegmentSpec(ImmutableList.of(actualQueryInterval))), context));
                    if (queryCompletedCallback != null) {
                        queryCompletedCallback.run();
                    }
                }
            }
        }, mocks.toArray());
    }
}
Also used : TimeseriesResultValue(io.druid.query.timeseries.TimeseriesResultValue) MergeIterable(io.druid.java.util.common.guava.MergeIterable) FunctionalIterable(io.druid.java.util.common.guava.FunctionalIterable) HashMap(java.util.HashMap) ArrayList(java.util.ArrayList) MultipleIntervalSegmentSpec(io.druid.query.spec.MultipleIntervalSegmentSpec) Capture(org.easymock.Capture) Result(io.druid.query.Result) ArrayList(java.util.ArrayList) List(java.util.List) ImmutableList(com.google.common.collect.ImmutableList) ISE(io.druid.java.util.common.ISE) TimeseriesQuery(io.druid.query.timeseries.TimeseriesQuery) QueryableDruidServer(io.druid.client.selector.QueryableDruidServer) FinalizeResultsQueryRunner(io.druid.query.FinalizeResultsQueryRunner) QueryRunner(io.druid.query.QueryRunner) IAnswer(org.easymock.IAnswer) Map(java.util.Map) TreeMap(java.util.TreeMap) ImmutableMap(com.google.common.collect.ImmutableMap) HashMap(java.util.HashMap) Interval(org.joda.time.Interval)

Example 4 with IAnswer

Use of org.easymock.IAnswer in project druid by druid-io.

From the class ReplayableFirehoseFactoryTest, method testReplayableFirehoseWithConnectRetries.

@Test
public void testReplayableFirehoseWithConnectRetries() throws Exception {
    final boolean[] hasMore = { true };
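    // The first connect() attempt throws, so the replayable firehose must retry; the second attempt succeeds.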
    expect(delegateFactory.connect(parser)).andThrow(new IOException()).andReturn(delegateFirehose);
    expect(delegateFirehose.hasMore()).andAnswer(new IAnswer<Boolean>() {

        @Override
        public Boolean answer() throws Throwable {
            return hasMore[0];
        }
    }).anyTimes();
    expect(delegateFirehose.nextRow()).andReturn(testRows.get(0)).andReturn(testRows.get(1)).andAnswer(new IAnswer<InputRow>() {

        @Override
        public InputRow answer() throws Throwable {
            hasMore[0] = false;
            return testRows.get(2);
        }
    });
    delegateFirehose.close();
    replayAll();
    List<InputRow> rows = Lists.newArrayList();
    try (Firehose firehose = replayableFirehoseFactory.connect(parser)) {
        while (firehose.hasMore()) {
            rows.add(firehose.nextRow());
        }
    }
    Assert.assertEquals(testRows, rows);
    verifyAll();
}
Also used : IAnswer(org.easymock.IAnswer) Firehose(io.druid.data.input.Firehose) MapBasedInputRow(io.druid.data.input.MapBasedInputRow) InputRow(io.druid.data.input.InputRow) IOException(java.io.IOException) Test(org.junit.Test)

Example 5 with IAnswer

Use of org.easymock.IAnswer in project druid by druid-io.

From the class ReplayableFirehoseFactoryTest, method testReplayableFirehoseWithNextRowRetries.

@Test
public void testReplayableFirehoseWithNextRowRetries() throws Exception {
    final boolean[] hasMore = { true };
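    // nextRow() fails once below, forcing a reconnect, so connect() and close() are each expected twice.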
    expect(delegateFactory.connect(parser)).andReturn(delegateFirehose).times(2);
    expect(delegateFirehose.hasMore()).andAnswer(new IAnswer<Boolean>() {

        @Override
        public Boolean answer() throws Throwable {
            return hasMore[0];
        }
    }).anyTimes();
    expect(delegateFirehose.nextRow()).andReturn(testRows.get(0)).andThrow(new RuntimeException()).andReturn(testRows.get(0)).andReturn(testRows.get(1)).andAnswer(new IAnswer<InputRow>() {

        @Override
        public InputRow answer() throws Throwable {
            hasMore[0] = false;
            return testRows.get(2);
        }
    });
    delegateFirehose.close();
    expectLastCall().times(2);
    replayAll();
    List<InputRow> rows = Lists.newArrayList();
    try (Firehose firehose = replayableFirehoseFactory.connect(parser)) {
        while (firehose.hasMore()) {
            rows.add(firehose.nextRow());
        }
    }
    Assert.assertEquals(testRows, rows);
    verifyAll();
}
Also used : IAnswer(org.easymock.IAnswer) Firehose(io.druid.data.input.Firehose) MapBasedInputRow(io.druid.data.input.MapBasedInputRow) InputRow(io.druid.data.input.InputRow) Test(org.junit.Test)
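
Both firehose tests above use the same stateful-answer trick: a one-element boolean array is captured by the anonymous IAnswer, and the answer for the last row flips it, which changes what the stubbed hasMore() returns from then on. A minimal, self-contained sketch of that pattern; RowSource and the row values are hypothetical, only the EasyMock and JUnit calls are real:

import static org.easymock.EasyMock.*;

import org.easymock.IAnswer;
import org.junit.Assert;
import org.junit.Test;

public class StatefulAnswerSketchTest {

    // Hypothetical collaborator, standing in for the Firehose above.
    interface RowSource {
        boolean hasMore();
        String nextRow();
    }

    @Test
    public void exhaustsAfterThreeRows() {
        // Anonymous classes may only capture (effectively) final locals, so the mutable
        // flag is smuggled in through a final one-element array, as in the tests above.
        final boolean[] hasMore = { true };
        final RowSource source = createMock(RowSource.class);

        // hasMore() re-reads the flag on every call, so its answer changes once the flag flips.
        expect(source.hasMore()).andAnswer(new IAnswer<Boolean>() {
            @Override
            public Boolean answer() throws Throwable {
                return hasMore[0];
            }
        }).anyTimes();

        // The last stubbed row flips the flag, which ends the read loop below.
        expect(source.nextRow())
            .andReturn("row-1")
            .andReturn("row-2")
            .andAnswer(new IAnswer<String>() {
                @Override
                public String answer() throws Throwable {
                    hasMore[0] = false;
                    return "row-3";
                }
            });
        replay(source);

        int rows = 0;
        while (source.hasMore()) {
            source.nextRow();
            rows++;
        }
        Assert.assertEquals(3, rows);
        verify(source);
    }
}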

Aggregations

IAnswer (org.easymock.IAnswer): 31
Test (org.junit.Test): 17
Bundle (org.osgi.framework.Bundle): 11
BundleContext (org.osgi.framework.BundleContext): 11
Hashtable (java.util.Hashtable): 8
ArrayList (java.util.ArrayList): 6
Dictionary (java.util.Dictionary): 6
ServiceReference (org.osgi.framework.ServiceReference): 6
BundleWiring (org.osgi.framework.wiring.BundleWiring): 6
Firehose (io.druid.data.input.Firehose): 5
InputRow (io.druid.data.input.InputRow): 5
MapBasedInputRow (io.druid.data.input.MapBasedInputRow): 5
URL (java.net.URL): 5
URLClassLoader (java.net.URLClassLoader): 5
HashMap (java.util.HashMap): 5
ServiceRegistrationHolder (org.apache.karaf.service.guard.impl.GuardProxyCatalog.ServiceRegistrationHolder): 5
ServiceRegistration (org.osgi.framework.ServiceRegistration): 5
CreateProxyRunnable (org.apache.karaf.service.guard.impl.GuardProxyCatalog.CreateProxyRunnable): 4
LinkedHashMap (java.util.LinkedHashMap): 3
ProxyManager (org.apache.aries.proxy.ProxyManager): 3