
Example 51 with OperationContext

Use of io.pravega.controller.store.stream.OperationContext in project pravega by pravega.

The class DeleteScopeTask, method deleteScopeContent.

public CompletableFuture<Void> deleteScopeContent(String scopeName, OperationContext context, long requestId) {
    Map<String, String> readerGroupMap = new HashMap<>();
    Iterator<Stream> iterator = listStreams(scopeName, context).asIterator();
    // Seal and delete each stream; record reader group streams in readerGroupMap for later deletion
    while (iterator.hasNext()) {
        Stream stream = iterator.next();
        Timer timer = new Timer();
        if (stream.getStreamName().startsWith(READER_GROUP_STREAM_PREFIX)) {
            readerGroupMap.put(stream.getStreamName().substring(READER_GROUP_STREAM_PREFIX.length()), stream.getStreamName());
        }
        log.debug("Processing seal and delete stream for Stream {}", stream);
        Futures.getThrowingException(Futures.exceptionallyExpecting(streamMetadataTasks.sealStream(scopeName, stream.getStreamName(), requestId), e -> {
            Throwable unwrap = Exceptions.unwrap(e);
            // ignore failures if the stream doesn't exist or we are unable to seal it.
            return unwrap instanceof InvalidStreamException || unwrap instanceof ControllerFailureException;
        }, Controller.UpdateStreamStatus.Status.STREAM_NOT_FOUND).thenCompose(sealed -> {
            ControllerService.reportSealStreamMetrics(scopeName, stream.getStreamName(), sealed, timer.getElapsed());
            return CompletableFuture.completedFuture(null);
        }).thenCompose(x -> streamMetadataTasks.deleteStream(stream.getScope(), stream.getStreamName(), requestId).thenCompose(status -> {
            ControllerService.reportDeleteStreamMetrics(scopeName, stream.getStreamName(), status, timer.getElapsed());
            return CompletableFuture.completedFuture(null);
        })));
    }
    // Delete ReaderGroups
    for (Map.Entry<String, String> rgMapEntry : readerGroupMap.entrySet()) {
        log.debug("Processing delete ReaderGroup for {}", rgMapEntry.getKey());
        Timer timer = new Timer();
        Futures.getThrowingException(streamMetadataTasks.getReaderGroupConfig(scopeName, rgMapEntry.getKey(), requestId).thenCompose(conf -> streamMetadataTasks.deleteReaderGroup(scopeName, rgMapEntry.getKey(), conf.getConfig().getReaderGroupId(), requestId).thenCompose(status -> {
            ControllerService.reportDeleteReaderGroupMetrics(scopeName, rgMapEntry.getValue(), status, timer.getElapsed());
            return CompletableFuture.completedFuture(null);
        })));
    }
    // Delete KVTs
    Iterator<KeyValueTableInfo> kvtIterator = listKVTs(scopeName, requestId, context).asIterator();
    while (kvtIterator.hasNext()) {
        String kvt = kvtIterator.next().getKeyValueTableName();
        Timer timer = new Timer();
        log.debug("Processing delete kvt for {}", kvt);
        Futures.getThrowingException(kvtMetadataTasks.deleteKeyValueTable(scopeName, kvt, context.getRequestId()).thenCompose(status -> {
            ControllerService.reportDeleteKVTableMetrics(scopeName, kvt, status, timer.getElapsed());
            return CompletableFuture.completedFuture(null);
        }));
    }
    return streamMetadataStore.deleteScopeRecursive(scopeName, context, executor).thenApply(status -> {
        log.debug("Recursive Delete Scope returned with a status {}", status);
        return null;
    });
}
Also used : OperationContext(io.pravega.controller.store.stream.OperationContext) StreamImpl(io.pravega.client.stream.impl.StreamImpl) Exceptions(io.pravega.common.Exceptions) LoggerFactory(org.slf4j.LoggerFactory) HashMap(java.util.HashMap) CompletableFuture(java.util.concurrent.CompletableFuture) Function(java.util.function.Function) TagLogger(io.pravega.common.tracing.TagLogger) KeyValueTableInfo(io.pravega.client.admin.KeyValueTableInfo) Stream(io.pravega.client.stream.Stream) Map(java.util.Map) ControllerFailureException(io.pravega.client.control.impl.ControllerFailureException) ScheduledExecutorService(java.util.concurrent.ScheduledExecutorService) StreamMetadataTasks(io.pravega.controller.task.Stream.StreamMetadataTasks) Controller(io.pravega.controller.stream.api.grpc.v1.Controller) ControllerService(io.pravega.controller.server.ControllerService) Iterator(java.util.Iterator) Collection(java.util.Collection) DeleteScopeEvent(io.pravega.shared.controller.event.DeleteScopeEvent) AsyncIterator(io.pravega.common.util.AsyncIterator) UUID(java.util.UUID) Timer(io.pravega.common.Timer) TableMetadataTasks(io.pravega.controller.task.KeyValueTable.TableMetadataTasks) Collectors(java.util.stream.Collectors) KVTableMetadataStore(io.pravega.controller.store.kvtable.KVTableMetadataStore) READER_GROUP_STREAM_PREFIX(io.pravega.shared.NameUtils.READER_GROUP_STREAM_PREFIX) AbstractMap(java.util.AbstractMap) List(java.util.List) ContinuationTokenAsyncIterator(io.pravega.common.util.ContinuationTokenAsyncIterator) InvalidStreamException(io.pravega.client.stream.InvalidStreamException) Preconditions(com.google.common.base.Preconditions) StreamMetadataStore(io.pravega.controller.store.stream.StreamMetadataStore) Futures(io.pravega.common.concurrent.Futures)
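
The seal step above leans on Pravega's Futures.exceptionallyExpecting to swallow only the failures that are expected during a recursive scope delete (the stream is already gone, or the controller cannot seal it) and substitute STREAM_NOT_FOUND, while any other failure still propagates. The sketch below approximates that behavior with plain JDK CompletableFuture; the class and method names here are illustrative, not Pravega's actual implementation.

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;
import java.util.function.Predicate;

public final class ExpectedFailures {
    // If the future fails with an "expected" exception, substitute a fallback value;
    // otherwise rethrow so unexpected failures keep propagating.
    public static <T> CompletableFuture<T> exceptionallyExpecting(CompletableFuture<T> future,
                                                                  Predicate<Throwable> isExpected,
                                                                  T fallback) {
        return future.handle((result, error) -> {
            if (error == null) {
                return result;
            }
            Throwable cause = (error instanceof CompletionException && error.getCause() != null)
                    ? error.getCause() : error;
            if (isExpected.test(cause)) {
                // e.g. the stream was already deleted, so failing to seal it is not an error here.
                return fallback;
            }
            throw new CompletionException(cause);
        });
    }
}

In the snippet, the expected-failure predicate checks for InvalidStreamException and ControllerFailureException, and the fallback value is Controller.UpdateStreamStatus.Status.STREAM_NOT_FOUND.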

Example 52 with OperationContext

Use of io.pravega.controller.store.stream.OperationContext in project pravega by pravega.

The class PeriodicWatermarking, method watermark.

/**
 * This method computes and emits a new watermark for the given stream.
 * It collects all the known writers for the given stream and includes only writers that are active (have reported
 * their marks recently). If all active writers have reported marks greater than the previously emitted watermark,
 * then a new watermark is computed and emitted. If not, the window for considering writers as active is progressed.
 * @param stream stream for which the watermark should be computed.
 * @return a CompletableFuture that completes once another iteration of periodic watermark computation has finished.
 */
public CompletableFuture<Void> watermark(Stream stream) {
    String scope = stream.getScope();
    String streamName = stream.getStreamName();
    long requestId = requestIdGenerator.get();
    String requestDescriptor = RequestTracker.buildRequestDescriptor("watermark", stream.getScope(), stream.getStreamName());
    requestTracker.trackRequest(requestDescriptor, requestId);
    OperationContext context = streamMetadataStore.createStreamContext(scope, streamName, requestId);
    if (scope.equals(NameUtils.INTERNAL_SCOPE_NAME)) {
        return CompletableFuture.completedFuture(null);
    }
    log.debug(requestId, "Periodic background processing for watermarking called for stream {}/{}", scope, streamName);
    CompletableFuture<Map<String, WriterMark>> allWriterMarks = Futures.exceptionallyExpecting(streamMetadataStore.getAllWriterMarks(scope, streamName, context, executor), e -> Exceptions.unwrap(e) instanceof StoreException.DataNotFoundException, Collections.emptyMap());
    return allWriterMarks.thenCompose(writers -> {
        WatermarkClient watermarkClient = watermarkClientCache.getUnchecked(stream);
        try {
            watermarkClient.reinitialize();
        } catch (Exception e) {
            log.warn(requestId, "Watermarking client for stream {} threw exception {} during reinitialize.", stream, Exceptions.unwrap(e).getClass());
            if (Exceptions.unwrap(e) instanceof NoSuchSegmentException) {
                log.info(requestId, "Invalidating the watermark client in cache for stream {}.", stream);
                watermarkClientCache.invalidate(stream);
            }
            throw e;
        }
        return streamMetadataStore.getConfiguration(scope, streamName, context, executor).thenCompose(config -> filterWritersAndComputeWatermark(scope, streamName, context, watermarkClient, writers, config));
    }).exceptionally(e -> {
        log.warn(requestId, "Exception thrown while trying to perform periodic watermark computation. Logging and ignoring.", e);
        return null;
    });
}
Also used : OperationContext(io.pravega.controller.store.stream.OperationContext) LoadingCache(com.google.common.cache.LoadingCache) StreamSegmentRecord(io.pravega.controller.store.stream.records.StreamSegmentRecord) LoggerFactory(org.slf4j.LoggerFactory) StreamConfiguration(io.pravega.client.stream.StreamConfiguration) ParametersAreNonnullByDefault(javax.annotation.ParametersAreNonnullByDefault) TagLogger(io.pravega.common.tracing.TagLogger) StoreException(io.pravega.controller.store.stream.StoreException) Stream(io.pravega.client.stream.Stream) Map(java.util.Map) SegmentWithRange(io.pravega.shared.watermarks.SegmentWithRange) Synchronized(lombok.Synchronized) WatermarkSerializer(io.pravega.client.watermark.WatermarkSerializer) ImmutableMap(com.google.common.collect.ImmutableMap) NoSuchSegmentException(io.pravega.client.segment.impl.NoSuchSegmentException) ConcurrentHashMap(java.util.concurrent.ConcurrentHashMap) Set(java.util.Set) RequestTracker(io.pravega.common.tracing.RequestTracker) Collectors(java.util.stream.Collectors) CacheLoader(com.google.common.cache.CacheLoader) List(java.util.List) CompletionStage(java.util.concurrent.CompletionStage) Entry(java.util.Map.Entry) CacheBuilder(com.google.common.cache.CacheBuilder) StreamMetadataStore(io.pravega.controller.store.stream.StreamMetadataStore) Futures(io.pravega.common.concurrent.Futures) Exceptions(io.pravega.common.Exceptions) AtomicBoolean(java.util.concurrent.atomic.AtomicBoolean) HashMap(java.util.HashMap) CompletableFuture(java.util.concurrent.CompletableFuture) AtomicReference(java.util.concurrent.atomic.AtomicReference) Function(java.util.function.Function) Supplier(java.util.function.Supplier) ArrayList(java.util.ArrayList) BucketStore(io.pravega.controller.store.stream.BucketStore) Lists(com.google.common.collect.Lists) ScheduledExecutorService(java.util.concurrent.ScheduledExecutorService) RevisionedStreamClient(io.pravega.client.state.RevisionedStreamClient) SynchronizerConfig(io.pravega.client.state.SynchronizerConfig) LongSummaryStatistics(java.util.LongSummaryStatistics) NameUtils(io.pravega.shared.NameUtils) WriterMark(io.pravega.controller.store.stream.records.WriterMark) Watermark(io.pravega.shared.watermarks.Watermark) TimeUnit(java.util.concurrent.TimeUnit) EpochRecord(io.pravega.controller.store.stream.records.EpochRecord) SynchronizerClientFactory(io.pravega.client.SynchronizerClientFactory) Closeable(java.io.Closeable) Revision(io.pravega.client.state.Revision) RemovalListener(com.google.common.cache.RemovalListener) VisibleForTesting(com.google.common.annotations.VisibleForTesting) RandomFactory(io.pravega.common.hash.RandomFactory) Comparator(java.util.Comparator) Collections(java.util.Collections) ClientConfig(io.pravega.client.ClientConfig)
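
The javadoc above hinges on the notion of "active" writers: only writers whose most recent mark falls inside a moving activity window take part in the watermark computation. Below is a minimal JDK-only sketch of that filtering idea; the ReportedMark record and the fixed activityTimeoutMillis window are simplifications of mine, since the real class reads WriterMark records from the metadata store and progresses the window from one iteration to the next.

import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public final class ActiveWriterFilter {

    // Hypothetical stand-in for the store-backed WriterMark: a writer id plus the
    // timestamp of its last reported mark.
    record ReportedMark(String writerId, long timestamp) {
    }

    // Keep only writers whose last reported mark falls inside the activity window
    // [now - activityTimeoutMillis, now]; inactive writers are excluded from the
    // watermark computation.
    static List<ReportedMark> activeWriters(Map<String, ReportedMark> allMarks,
                                            long now, long activityTimeoutMillis) {
        return allMarks.values().stream()
                .filter(mark -> mark.timestamp() >= now - activityTimeoutMillis)
                .collect(Collectors.toList());
    }
}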

Example 53 with OperationContext

Use of io.pravega.controller.store.stream.OperationContext in project pravega by pravega.

The class PeriodicWatermarking, method computeWatermark.

/**
 * This method takes the marks (time + position) of active writers, finds the greatest lower bound on time and
 * the least upper bound on positions, and returns a watermark object composed of the two.
 * The least upper bound computed from positions may not result in a consistent and complete stream cut.
 * So, a positional upper bound is then converted into a stream cut by including segments from a higher epoch.
 * It is also possible that, in an effort to fill a missing range, we end up creating an upper bound composed of
 * segments from the highest epoch. In the next iteration, with new writer positions, we may be able to compute a
 * tighter upper bound. But since the watermark has to advance both position and time, we take the upper bound of
 * the previous stream cut and the new stream cut.
 *
 * @param scope scope
 * @param streamName stream name
 * @param context operation context
 * @param activeWriters marks for all active writers.
 * @param previousWatermark previous watermark that was emitted.
 * @return CompletableFuture which, when completed, will contain the watermark to be emitted.
 */
private CompletableFuture<Watermark> computeWatermark(String scope, String streamName, OperationContext context, List<Map.Entry<String, WriterMark>> activeWriters, Watermark previousWatermark) {
    long requestId = context.getRequestId();
    Watermark.WatermarkBuilder builder = Watermark.builder();
    ConcurrentHashMap<SegmentWithRange, Long> upperBound = new ConcurrentHashMap<>();
    // We deliberately make two passes over the writers: the first pass finds the lowest time; the second pass
    // converts writer positions to StreamSegmentRecord objects by retrieving their ranges from the store and then
    // performs the computation on those objects.
    LongSummaryStatistics summarized = activeWriters.stream().collect(Collectors.summarizingLong(x -> x.getValue().getTimestamp()));
    long lowerBoundOnTime = summarized.getMin();
    long upperBoundOnTime = summarized.getMax();
    if (lowerBoundOnTime > previousWatermark.getLowerTimeBound()) {
        CompletableFuture<List<Map<SegmentWithRange, Long>>> positionsFuture = Futures.allOfWithResults(activeWriters.stream().map(x -> {
            return Futures.keysAllOfWithResults(x.getValue().getPosition().entrySet().stream().collect(Collectors.toMap(y -> getSegmentWithRange(scope, streamName, context, y.getKey()), Entry::getValue)));
        }).collect(Collectors.toList()));
        log.debug(requestId, "Emitting watermark for stream {}/{} with time {}", scope, streamName, lowerBoundOnTime);
        return positionsFuture.thenAccept(listOfPositions -> listOfPositions.forEach(position -> {
            // add writer positions to upperBound map.
            addToUpperBound(position, upperBound);
        })).thenCompose(v -> computeStreamCut(scope, streamName, context, upperBound, previousWatermark).thenApply(streamCut -> builder.lowerTimeBound(lowerBoundOnTime).upperTimeBound(upperBoundOnTime).streamCut(ImmutableMap.copyOf(streamCut)).build()));
    } else {
        // new time is not advanced. No watermark to be emitted.
        return CompletableFuture.completedFuture(null);
    }
}
Also used : LongSummaryStatistics(java.util.LongSummaryStatistics) LoadingCache(com.google.common.cache.LoadingCache) StreamSegmentRecord(io.pravega.controller.store.stream.records.StreamSegmentRecord) LoggerFactory(org.slf4j.LoggerFactory) StreamConfiguration(io.pravega.client.stream.StreamConfiguration) ParametersAreNonnullByDefault(javax.annotation.ParametersAreNonnullByDefault) TagLogger(io.pravega.common.tracing.TagLogger) StoreException(io.pravega.controller.store.stream.StoreException) Stream(io.pravega.client.stream.Stream) Map(java.util.Map) SegmentWithRange(io.pravega.shared.watermarks.SegmentWithRange) Synchronized(lombok.Synchronized) WatermarkSerializer(io.pravega.client.watermark.WatermarkSerializer) ImmutableMap(com.google.common.collect.ImmutableMap) NoSuchSegmentException(io.pravega.client.segment.impl.NoSuchSegmentException) ConcurrentHashMap(java.util.concurrent.ConcurrentHashMap) Set(java.util.Set) RequestTracker(io.pravega.common.tracing.RequestTracker) Collectors(java.util.stream.Collectors) CacheLoader(com.google.common.cache.CacheLoader) List(java.util.List) CompletionStage(java.util.concurrent.CompletionStage) Entry(java.util.Map.Entry) CacheBuilder(com.google.common.cache.CacheBuilder) StreamMetadataStore(io.pravega.controller.store.stream.StreamMetadataStore) Futures(io.pravega.common.concurrent.Futures) OperationContext(io.pravega.controller.store.stream.OperationContext) Exceptions(io.pravega.common.Exceptions) AtomicBoolean(java.util.concurrent.atomic.AtomicBoolean) HashMap(java.util.HashMap) CompletableFuture(java.util.concurrent.CompletableFuture) AtomicReference(java.util.concurrent.atomic.AtomicReference) Function(java.util.function.Function) Supplier(java.util.function.Supplier) ArrayList(java.util.ArrayList) BucketStore(io.pravega.controller.store.stream.BucketStore) Lists(com.google.common.collect.Lists) ScheduledExecutorService(java.util.concurrent.ScheduledExecutorService) RevisionedStreamClient(io.pravega.client.state.RevisionedStreamClient) SynchronizerConfig(io.pravega.client.state.SynchronizerConfig) NameUtils(io.pravega.shared.NameUtils) WriterMark(io.pravega.controller.store.stream.records.WriterMark) Watermark(io.pravega.shared.watermarks.Watermark) TimeUnit(java.util.concurrent.TimeUnit) EpochRecord(io.pravega.controller.store.stream.records.EpochRecord) SynchronizerClientFactory(io.pravega.client.SynchronizerClientFactory) Closeable(java.io.Closeable) Revision(io.pravega.client.state.Revision) RemovalListener(com.google.common.cache.RemovalListener) VisibleForTesting(com.google.common.annotations.VisibleForTesting) RandomFactory(io.pravega.common.hash.RandomFactory) Comparator(java.util.Comparator) Collections(java.util.Collections) ClientConfig(io.pravega.client.ClientConfig)
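
To make the "greatest lower bound on time, least upper bound on positions" idea concrete, the JDK-only sketch below computes the time bound with Collectors.summarizingLong, as the snippet does, and merges writer positions by keeping the highest offset per segment. Keying by a bare segment id is a simplification of mine; the real addToUpperBound works with SegmentWithRange and has to reconcile overlapping ranges coming from different epochs.

import java.util.List;
import java.util.LongSummaryStatistics;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.stream.Collectors;

public final class WatermarkBounds {

    // Greatest lower bound on time across active writers: the minimum reported timestamp.
    static long lowerBoundOnTime(List<Long> writerTimes) {
        LongSummaryStatistics stats = writerTimes.stream()
                .collect(Collectors.summarizingLong(Long::longValue));
        return stats.getMin();
    }

    // Least upper bound on positions: for each segment, keep the highest offset seen
    // across all writer positions.
    static Map<Long, Long> upperBoundOnPositions(List<Map<Long, Long>> writerPositions) {
        Map<Long, Long> upperBound = new ConcurrentHashMap<>();
        writerPositions.forEach(position ->
                position.forEach((segment, offset) -> upperBound.merge(segment, offset, Math::max)));
        return upperBound;
    }
}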

Example 54 with OperationContext

Use of io.pravega.controller.store.stream.OperationContext in project pravega by pravega.

The class PeriodicRetention, method retention.

public CompletableFuture<Void> retention(Stream stream) {
    // Track the new request for this automatic truncation.
    long requestId = requestIdGenerator.get();
    String requestDescriptor = RequestTracker.buildRequestDescriptor("truncateStream", stream.getScope(), stream.getStreamName());
    requestTracker.trackRequest(requestDescriptor, requestId);
    OperationContext context = streamMetadataStore.createStreamContext(stream.getScope(), stream.getStreamName(), requestId);
    log.debug(requestId, "Periodic background processing for retention called for stream {}/{}", stream.getScope(), stream.getStreamName());
    return RetryHelper.withRetriesAsync(() -> streamMetadataStore.getConfiguration(stream.getScope(), stream.getStreamName(), context, executor).thenCompose(config -> streamMetadataTasks.retention(stream.getScope(), stream.getStreamName(), config.getRetentionPolicy(), System.currentTimeMillis(), context, this.streamMetadataTasks.retrieveDelegationToken())).exceptionally(e -> {
        log.warn(requestId, "Exception thrown while performing auto retention for stream {} ", stream, e);
        throw new CompletionException(e);
    }), RetryHelper.UNCONDITIONAL_PREDICATE, 5, executor).exceptionally(e -> {
        log.warn(requestId, "Unable to perform retention for stream {}. " + "Ignoring, retention will be attempted in next cycle.", stream, e);
        return null;
    }).thenRun(() -> requestTracker.untrackRequest(requestDescriptor));
}
Also used : OperationContext(io.pravega.controller.store.stream.OperationContext) LoggerFactory(org.slf4j.LoggerFactory) CompletableFuture(java.util.concurrent.CompletableFuture) CompletionException(java.util.concurrent.CompletionException) RequestTracker(io.pravega.common.tracing.RequestTracker) Supplier(java.util.function.Supplier) TagLogger(io.pravega.common.tracing.TagLogger) Stream(io.pravega.client.stream.Stream) ScheduledExecutorService(java.util.concurrent.ScheduledExecutorService) StreamMetadataTasks(io.pravega.controller.task.Stream.StreamMetadataTasks) RandomFactory(io.pravega.common.hash.RandomFactory) StreamMetadataStore(io.pravega.controller.store.stream.StreamMetadataStore) RetryHelper(io.pravega.controller.util.RetryHelper)
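
The retention task above wraps the whole store-plus-task pipeline in RetryHelper.withRetriesAsync with an unconditional predicate and five attempts, so transient failures are retried before the cycle gives up. Below is a minimal JDK-only sketch of that unconditional async retry idea; the helper name is mine, no backoff is applied, and synchronous exceptions thrown by the supplier itself are not handled.

import java.util.concurrent.CompletableFuture;
import java.util.function.Supplier;

public final class AsyncRetry {

    // Run an asynchronous task, retrying on any failure until maxAttempts is exhausted.
    // The last failure propagates if every attempt fails.
    static <T> CompletableFuture<T> withRetries(Supplier<CompletableFuture<T>> task, int maxAttempts) {
        return task.get()
                .thenApply(CompletableFuture::completedFuture)
                .exceptionally(t -> maxAttempts > 1
                        ? withRetries(task, maxAttempts - 1)
                        : CompletableFuture.<T>failedFuture(t))
                .thenCompose(f -> f);
    }
}

In the snippet, RetryHelper.UNCONDITIONAL_PREDICATE plays the "retry on any failure" role and the attempt count is 5.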

Example 55 with OperationContext

Use of io.pravega.controller.store.stream.OperationContext in project pravega by pravega.

The class ControllerService, method getSegmentsAtHead.

public CompletableFuture<Map<SegmentId, Long>> getSegmentsAtHead(final String scope, final String stream, long requestId) {
    Exceptions.checkNotNullOrEmpty(scope, "scope");
    Exceptions.checkNotNullOrEmpty(stream, "stream");
    // Fetch the segments at the head of the stream and map each to a fully qualified SegmentId with its offset.
    OperationContext context = streamStore.createStreamContext(scope, stream, requestId);
    return streamStore.getSegmentsAtHead(scope, stream, context, executor).thenApply(segments -> {
        return segments.entrySet().stream().collect(Collectors.toMap(entry -> ModelHelper.createSegmentId(scope, stream, entry.getKey().segmentId()), Map.Entry::getValue));
    });
}
Also used : OperationContext(io.pravega.controller.store.stream.OperationContext) StreamSegmentRecord(io.pravega.controller.store.stream.records.StreamSegmentRecord) LoggerFactory(org.slf4j.LoggerFactory) StreamConfiguration(io.pravega.client.stream.StreamConfiguration) KeyValueTableConfiguration(io.pravega.client.tables.KeyValueTableConfiguration) SecureRandom(java.security.SecureRandom) Cluster(io.pravega.common.cluster.Cluster) TagLogger(io.pravega.common.tracing.TagLogger) StoreException(io.pravega.controller.store.stream.StoreException) Pair(org.apache.commons.lang3.tuple.Pair) Duration(java.time.Duration) Map(java.util.Map) SubscribersResponse(io.pravega.controller.stream.api.grpc.v1.Controller.SubscribersResponse) Controller(io.pravega.controller.stream.api.grpc.v1.Controller) ReaderGroupConfig(io.pravega.client.stream.ReaderGroupConfig) DeleteScopeStatus(io.pravega.controller.stream.api.grpc.v1.Controller.DeleteScopeStatus) CreateStreamStatus(io.pravega.controller.stream.api.grpc.v1.Controller.CreateStreamStatus) ImmutableMap(com.google.common.collect.ImmutableMap) CompletionException(java.util.concurrent.CompletionException) RequestTracker(io.pravega.common.tracing.RequestTracker) UUID(java.util.UUID) Collectors(java.util.stream.Collectors) RetriesExhaustedException(io.pravega.common.util.RetriesExhaustedException) KVTableMetadataStore(io.pravega.controller.store.kvtable.KVTableMetadataStore) List(java.util.List) SegmentRecord(io.pravega.controller.store.SegmentRecord) VersionedTransactionData(io.pravega.controller.store.stream.VersionedTransactionData) StreamMetadataStore(io.pravega.controller.store.stream.StreamMetadataStore) PingTxnStatus(io.pravega.controller.stream.api.grpc.v1.Controller.PingTxnStatus) Futures(io.pravega.common.concurrent.Futures) SegmentId(io.pravega.controller.stream.api.grpc.v1.Controller.SegmentId) StreamMetrics(io.pravega.controller.metrics.StreamMetrics) CreateReaderGroupResponse(io.pravega.controller.stream.api.grpc.v1.Controller.CreateReaderGroupResponse) TransactionMetrics(io.pravega.controller.metrics.TransactionMetrics) Getter(lombok.Getter) CreateScopeStatus(io.pravega.controller.stream.api.grpc.v1.Controller.CreateScopeStatus) ModelHelper(io.pravega.client.control.impl.ModelHelper) Exceptions(io.pravega.common.Exceptions) KeyValueTableConfigResponse(io.pravega.controller.stream.api.grpc.v1.Controller.KeyValueTableConfigResponse) ScaleStatusResponse(io.pravega.controller.stream.api.grpc.v1.Controller.ScaleStatusResponse) TxnState(io.pravega.controller.stream.api.grpc.v1.Controller.TxnState) CompletableFuture(java.util.concurrent.CompletableFuture) UpdateSubscriberStatus(io.pravega.controller.stream.api.grpc.v1.Controller.UpdateSubscriberStatus) ArrayList(java.util.ArrayList) BucketStore(io.pravega.controller.store.stream.BucketStore) NodeUri(io.pravega.controller.stream.api.grpc.v1.Controller.NodeUri) DeleteKVTableStatus(io.pravega.controller.stream.api.grpc.v1.Controller.DeleteKVTableStatus) ScaleMetadata(io.pravega.controller.store.stream.ScaleMetadata) DeleteStreamStatus(io.pravega.controller.stream.api.grpc.v1.Controller.DeleteStreamStatus) StreamMetadataTasks(io.pravega.controller.task.Stream.StreamMetadataTasks) ScaleResponse(io.pravega.controller.stream.api.grpc.v1.Controller.ScaleResponse) UpdateReaderGroupResponse(io.pravega.controller.stream.api.grpc.v1.Controller.UpdateReaderGroupResponse) NameUtils(io.pravega.shared.NameUtils) Executor(java.util.concurrent.Executor) Timer(io.pravega.common.Timer) SegmentRange(io.pravega.controller.stream.api.grpc.v1.Controller.SegmentRange) TableMetadataTasks(io.pravega.controller.task.KeyValueTable.TableMetadataTasks) ImmutablePair(org.apache.commons.lang3.tuple.ImmutablePair) UpdateStreamStatus(io.pravega.controller.stream.api.grpc.v1.Controller.UpdateStreamStatus) StreamTransactionMetadataTasks(io.pravega.controller.task.Stream.StreamTransactionMetadataTasks) TxnStatus(io.pravega.controller.stream.api.grpc.v1.Controller.TxnStatus) ClusterException(io.pravega.common.cluster.ClusterException) Preconditions(com.google.common.base.Preconditions) State(io.pravega.controller.store.stream.State) DeleteReaderGroupStatus(io.pravega.controller.stream.api.grpc.v1.Controller.DeleteReaderGroupStatus) RandomFactory(io.pravega.common.hash.RandomFactory) CreateKeyValueTableStatus(io.pravega.controller.stream.api.grpc.v1.Controller.CreateKeyValueTableStatus) AllArgsConstructor(lombok.AllArgsConstructor) Comparator(java.util.Comparator) ReaderGroupConfigResponse(io.pravega.controller.stream.api.grpc.v1.Controller.ReaderGroupConfigResponse)
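
The body of getSegmentsAtHead is essentially a re-keying of the map returned by the store: each raw segment id is turned into a fully qualified segment id while the head offset is carried over unchanged. The sketch below isolates that Collectors.toMap transformation, with a hypothetical QualifiedSegment record standing in for the gRPC SegmentId type built by ModelHelper.createSegmentId.

import java.util.Map;
import java.util.stream.Collectors;

public final class SegmentMapping {

    // Hypothetical value type standing in for Controller.SegmentId, for illustration only.
    record QualifiedSegment(String scope, String stream, long segmentId) {
    }

    // Re-key a map of raw segment ids -> head offsets into fully qualified segment ids,
    // mirroring the Collectors.toMap transformation in getSegmentsAtHead.
    static Map<QualifiedSegment, Long> qualify(String scope, String stream,
                                               Map<Long, Long> headOffsets) {
        return headOffsets.entrySet().stream()
                .collect(Collectors.toMap(
                        e -> new QualifiedSegment(scope, stream, e.getKey()),
                        Map.Entry::getValue));
    }
}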

Aggregations

OperationContext (io.pravega.controller.store.stream.OperationContext): 76 usages
CompletableFuture (java.util.concurrent.CompletableFuture): 53 usages
Futures (io.pravega.common.concurrent.Futures): 48 usages
StreamMetadataStore (io.pravega.controller.store.stream.StreamMetadataStore): 44 usages
ScheduledExecutorService (java.util.concurrent.ScheduledExecutorService): 42 usages
Exceptions (io.pravega.common.Exceptions): 41 usages
Collectors (java.util.stream.Collectors): 41 usages
UUID (java.util.UUID): 39 usages
StoreException (io.pravega.controller.store.stream.StoreException): 38 usages
List (java.util.List): 38 usages
TagLogger (io.pravega.common.tracing.TagLogger): 37 usages
LoggerFactory (org.slf4j.LoggerFactory): 37 usages
Preconditions (com.google.common.base.Preconditions): 36 usages
Map (java.util.Map): 32 usages
NameUtils (io.pravega.shared.NameUtils): 31 usages
VisibleForTesting (com.google.common.annotations.VisibleForTesting): 27 usages
StreamConfiguration (io.pravega.client.stream.StreamConfiguration): 26 usages
State (io.pravega.controller.store.stream.State): 26 usages
CompletionException (java.util.concurrent.CompletionException): 26 usages
BucketStore (io.pravega.controller.store.stream.BucketStore): 25 usages