
Example 1 with ListBatchReader

Use of cz.o2.proxima.direct.storage.ListBatchReader in project proxima-platform by O2-Czech-Republic.

From class BatchLogReadTest, method testBatchLogReadWithLimit.

@Test(timeout = 60000)
public void testBatchLogReadWithLimit() {
    int numElements = 1000;
    List<StreamElement> input = createInput(numElements);
    ListBatchReader reader = ListBatchReader.of(direct.getContext(), input);
    testReadingFromBatchLogMany(50, BatchLogRead.of(Collections.singletonList(this.data), 50, repo, reader));
}
Also used: ListBatchReader(cz.o2.proxima.direct.storage.ListBatchReader), StreamElement(cz.o2.proxima.storage.StreamElement), Test(org.junit.Test)
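
The createInput helper is not part of the snippet. A minimal sketch of what it might look like, assuming the StreamElement.upsert factory and that the test keeps the entity and attribute descriptors in fields named gateway and data (both names are placeholders here, not taken from the source):

private List<StreamElement> createInput(int numElements) {
    // Hypothetical helper: one upsert per element, keyed key_0 .. key_{numElements - 1}.
    // The descriptors (gateway, data) are assumed fields of the surrounding test class.
    final List<StreamElement> result = new ArrayList<>();
    for (int i = 0; i < numElements; i++) {
        result.add(
            StreamElement.upsert(
                gateway,
                data,
                UUID.randomUUID().toString(),
                "key_" + i,
                data.getName(),
                System.currentTimeMillis(),
                new byte[] {(byte) i}));
    }
    return result;
}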

Example 2 with ListBatchReader

Use of cz.o2.proxima.direct.storage.ListBatchReader in project proxima-platform by O2-Czech-Republic.

From class BatchLogReadTest, method testReadingFromBatchLog.

@Test(timeout = 30000)
public void testReadingFromBatchLog() {
    List<StreamElement> data = createInput(1);
    ListBatchReader reader = ListBatchReader.of(context, data);
    testReadingFromBatchLog(Collections.singletonList(this.data), reader);
}
Also used: ListBatchReader(cz.o2.proxima.direct.storage.ListBatchReader), StreamElement(cz.o2.proxima.storage.StreamElement), Test(org.junit.Test)
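
A reader built with ListBatchReader.of can also be consumed without the test helpers, through the observe call used in the later examples. A minimal sketch, assuming of(...) exposes the whole list as a single partition and run from a test method that declares throws InterruptedException:

final ListBatchReader reader = ListBatchReader.of(context, createInput(10));
final BlockingQueue<String> keys = new LinkedBlockingQueue<>();
final CountDownLatch done = new CountDownLatch(1);
reader.observe(
    // Assumption: ListBatchReader.of puts the whole list into partition 0.
    Collections.singletonList(Partition.of(0)),
    Collections.singletonList(this.data),
    new BatchLogObserver() {

        @Override
        public boolean onNext(StreamElement element) {
            keys.add(element.getKey());
            // Returning true asks the reader to keep delivering elements.
            return true;
        }

        @Override
        public void onCompleted() {
            done.countDown();
        }
    });
done.await();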

Example 3 with ListBatchReader

Use of cz.o2.proxima.direct.storage.ListBatchReader in project proxima-platform by O2-Czech-Republic.

From class BatchLogSourceFunctionTest, method testRunAndClose.

@Test
void testRunAndClose() throws Exception {
    final Repository repository = Repository.ofTest(ConfigFactory.parseString(MODEL));
    final AttributeDescriptor<?> attribute = repository.getEntity("test").getAttribute("data");
    final BatchLogSourceFunction<StreamElement> sourceFunction = new BatchLogSourceFunction<StreamElement>(repository.asFactory(), Collections.singletonList(attribute), ResultExtractor.identity()) {

        @Override
        BatchLogReader createLogReader(List<AttributeDescriptor<?>> attributeDescriptors) {
            final DirectDataOperator direct = repository.getOrCreateOperator(DirectDataOperator.class);
            final ListBatchReader reader = ListBatchReader.ofPartitioned(direct.getContext());
            return OffsetTrackingBatchLogReader.of(reader);
        }
    };
    final AbstractStreamOperatorTestHarness<StreamElement> testHarness = createTestHarness(sourceFunction, 1, 0);
    testHarness.initializeEmptyState();
    testHarness.open();
    final CheckedThread runThread = new CheckedThread("run") {

        @Override
        public void go() throws Exception {
            sourceFunction.run(new TestSourceContext<StreamElement>() {

                @Override
                public void collect(StreamElement element) {
                    // No-op: elements emitted by the source are ignored in this test.
                }
            });
        }
    };
    runThread.start();
    sourceFunction.awaitRunning();
    sourceFunction.cancel();
    testHarness.close();
    // Make sure run thread finishes normally.
    runThread.sync();
}
Also used: DirectDataOperator(cz.o2.proxima.direct.core.DirectDataOperator), Repository(cz.o2.proxima.repository.Repository), ListBatchReader(cz.o2.proxima.direct.storage.ListBatchReader), StreamElement(cz.o2.proxima.storage.StreamElement), ArrayList(java.util.ArrayList), List(java.util.List), CheckedThread(org.apache.flink.core.testutils.CheckedThread), Test(org.junit.jupiter.api.Test)
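
ListBatchReader.ofPartitioned also accepts explicit partition data, as the BatchLogReaderTest examples below show. A sketch of the same createLogReader override with two non-empty partitions, so the source function actually has elements to emit; newPartition stands in for an element factory like the one assumed in those later examples:

@Override
BatchLogReader createLogReader(List<AttributeDescriptor<?>> attributeDescriptors) {
    final DirectDataOperator direct = repository.getOrCreateOperator(DirectDataOperator.class);
    // Two explicit partitions instead of an empty reader; newPartition is a
    // hypothetical helper producing StreamElements with the given key prefix.
    final ListBatchReader reader =
        ListBatchReader.ofPartitioned(
            direct.getContext(),
            Arrays.asList(newPartition("first_", 10), newPartition("second_", 10)));
    // The wrapper records which offsets have been consumed, which is presumably
    // what the source function checkpoints.
    return OffsetTrackingBatchLogReader.of(reader);
}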

Example 4 with ListBatchReader

Use of cz.o2.proxima.direct.storage.ListBatchReader in project proxima-platform by O2-Czech-Republic.

From class BatchLogReaderTest, method testObserveOffsets.

@Test
public void testObserveOffsets() throws InterruptedException {
    final List<StreamElement> firstPartition = newPartition("first_", 100);
    final List<StreamElement> secondPartition = newPartition("second_", 80);
    final List<StreamElement> thirdPartition = newPartition("third_", 60);
    final ListBatchReader reader = ListBatchReader.ofPartitioned(direct.getContext(), Arrays.asList(firstPartition, secondPartition, thirdPartition));
    final BlockingQueue<String> consumed = new LinkedBlockingQueue<>();
    final CountDownLatch doneConsuming = new CountDownLatch(1);
    reader.observeOffsets(Arrays.asList(Offset.of(Partition.of(0), 50, false), Offset.of(Partition.of(2), 40, false)), Collections.singletonList(attr), new BatchLogObserver() {

        @Override
        public boolean onNext(StreamElement element) {
            assertTrue(consumed.add(element.getKey()));
            return true;
        }

        @Override
        public void onCompleted() {
            doneConsuming.countDown();
        }
    });
    doneConsuming.await();
    final Set<String> expected = Streams.concat(firstPartition.subList(50, firstPartition.size()).stream(), thirdPartition.subList(40, thirdPartition.size()).stream()).map(StreamElement::getKey).collect(Collectors.toSet());
    assertEquals(expected, new HashSet<>(consumed));
}
Also used: ListBatchReader(cz.o2.proxima.direct.storage.ListBatchReader), StreamElement(cz.o2.proxima.storage.StreamElement), LinkedBlockingQueue(java.util.concurrent.LinkedBlockingQueue), CountDownLatch(java.util.concurrent.CountDownLatch), Test(org.junit.Test)
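
The offsets passed to observeOffsets determine where reading resumes and which partitions are read at all. A short sketch of the arithmetic behind the expected set above, assuming Offset.of(partition, elementIndex, last) points at the next element to consume and that partitions without an offset are skipped entirely:

// Partition 0 (100 elements) resumes at index 50: 100 - 50 = 50 elements replayed.
// Partition 1 (80 elements) has no offset in the list: nothing is read from it.
// Partition 2 (60 elements) resumes at index 40: 60 - 40 = 20 elements replayed.
final int expectedCount = (100 - 50) + (60 - 40); // 70 keys in total
assertEquals(expectedCount, consumed.size());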

Example 5 with ListBatchReader

Use of cz.o2.proxima.direct.storage.ListBatchReader in project proxima-platform by O2-Czech-Republic.

From class BatchLogReaderTest, method testObserveReadOffset.

@Test
public void testObserveReadOffset() throws InterruptedException {
    final List<StreamElement> firstPartition = newPartition("first_", 10);
    final List<StreamElement> secondPartition = newPartition("second_", 20);
    final List<StreamElement> thirdPartition = newPartition("third_", 30);
    final ListBatchReader reader = ListBatchReader.ofPartitioned(direct.getContext(), Arrays.asList(firstPartition, secondPartition, thirdPartition));
    final ConcurrentMap<Partition, Offset> lastOffsets = new ConcurrentHashMap<>();
    final CountDownLatch doneConsuming = new CountDownLatch(1);
    reader.observe(Arrays.asList(Partition.of(0), Partition.of(1), Partition.of(2)), Collections.singletonList(attr), new BatchLogObserver() {

        @Override
        public boolean onNext(StreamElement element, OnNextContext context) {
            lastOffsets.merge(context.getPartition(), context.getOffset(), (oldValue, newValue) -> {
                assertTrue(oldValue.getElementIndex() < newValue.getElementIndex());
                return newValue;
            });
            return true;
        }

        @Override
        public void onCompleted() {
            doneConsuming.countDown();
        }
    });
    doneConsuming.await();
    assertEquals(Offset.of(Partition.of(0), 9, true), lastOffsets.get(Partition.of(0)));
    assertEquals(Offset.of(Partition.of(1), 19, true), lastOffsets.get(Partition.of(1)));
    assertEquals(Offset.of(Partition.of(2), 29, true), lastOffsets.get(Partition.of(2)));
}
Also used: IntStream(java.util.stream.IntStream), Arrays(java.util.Arrays), Partition(cz.o2.proxima.storage.Partition), EntityDescriptor(cz.o2.proxima.repository.EntityDescriptor), ArrayList(java.util.ArrayList), ConcurrentMap(java.util.concurrent.ConcurrentMap), HashSet(java.util.HashSet), ExceptionUtils(cz.o2.proxima.util.ExceptionUtils), StreamElement(cz.o2.proxima.storage.StreamElement), After(org.junit.After), ConfigFactory(com.typesafe.config.ConfigFactory), ListBatchReader(cz.o2.proxima.direct.storage.ListBatchReader), Optionals(cz.o2.proxima.util.Optionals), Before(org.junit.Before), Repository(cz.o2.proxima.repository.Repository), SynchronousQueue(java.util.concurrent.SynchronousQueue), AttributeDescriptor(cz.o2.proxima.repository.AttributeDescriptor), ConcurrentHashMap(java.util.concurrent.ConcurrentHashMap), Set(java.util.Set), BlockingQueue(java.util.concurrent.BlockingQueue), Test(org.junit.Test), UUID(java.util.UUID), Streams(com.google.common.collect.Streams), CommitCallback(cz.o2.proxima.direct.core.CommitCallback), LinkedBlockingQueue(java.util.concurrent.LinkedBlockingQueue), Collectors(java.util.stream.Collectors), CountDownLatch(java.util.concurrent.CountDownLatch), ReplicationRunner(cz.o2.proxima.util.ReplicationRunner), List(java.util.List), CommitLogReaderTest.withNumRecordsPerSec(cz.o2.proxima.direct.commitlog.CommitLogReaderTest.withNumRecordsPerSec), DirectDataOperator(cz.o2.proxima.direct.core.DirectDataOperator), Assert(org.junit.Assert), Collections(java.util.Collections)
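
The offsets captured through OnNextContext here are the same type that observeOffsets consumes in the previous example, so a consumer can record a position and resume from it later. A minimal sketch, assuming the reader still holds the same three partitions (10, 20 and 30 elements) and that the recorded element index is inclusive, so resuming at the next index replays only the unread tail:

// Replay the second half of the second partition (elements 10..19).
final BlockingQueue<String> replayed = new LinkedBlockingQueue<>();
final CountDownLatch replayDone = new CountDownLatch(1);
reader.observeOffsets(
    Collections.singletonList(Offset.of(Partition.of(1), 10, false)),
    Collections.singletonList(attr),
    new BatchLogObserver() {

        @Override
        public boolean onNext(StreamElement element) {
            replayed.add(element.getKey());
            return true;
        }

        @Override
        public void onCompleted() {
            replayDone.countDown();
        }
    });
replayDone.await();
// 20 elements in the partition minus the 10 already consumed.
assertEquals(10, replayed.size());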

Aggregations

StreamElement (cz.o2.proxima.storage.StreamElement): 9
ListBatchReader (cz.o2.proxima.direct.storage.ListBatchReader): 7
Test (org.junit.Test): 7
ArrayList (java.util.ArrayList): 5
CountDownLatch (java.util.concurrent.CountDownLatch): 5
BatchLogObserver (cz.o2.proxima.direct.batch.BatchLogObserver): 3
DirectDataOperator (cz.o2.proxima.direct.core.DirectDataOperator): 3
List (java.util.List): 3
Repository (cz.o2.proxima.repository.Repository): 2
LinkedBlockingQueue (java.util.concurrent.LinkedBlockingQueue): 2
CheckedThread (org.apache.flink.core.testutils.CheckedThread): 2
Streams (com.google.common.collect.Streams): 1
ConfigFactory (com.typesafe.config.ConfigFactory): 1
CommitLogReaderTest.withNumRecordsPerSec (cz.o2.proxima.direct.commitlog.CommitLogReaderTest.withNumRecordsPerSec): 1
CommitCallback (cz.o2.proxima.direct.core.CommitCallback): 1
AttributeDescriptor (cz.o2.proxima.repository.AttributeDescriptor): 1
EntityDescriptor (cz.o2.proxima.repository.EntityDescriptor): 1
Partition (cz.o2.proxima.storage.Partition): 1
ExceptionUtils (cz.o2.proxima.util.ExceptionUtils): 1
Optionals (cz.o2.proxima.util.Optionals): 1