
Example 16 with StoreException

Use of io.confluent.kafka.schemaregistry.storage.exceptions.StoreException in the schema-registry project by confluentinc.

From the class KafkaSchemaRegistry, the method deleteCompatibilityConfig:

public void deleteCompatibilityConfig(String subject) throws SchemaRegistryStoreException, OperationNotPermittedException {
    if (isReadOnlyMode(subject)) {
        throw new OperationNotPermittedException("Subject " + subject + " is in read-only mode");
    }
    try {
        // Make sure the background Kafka reader has caught up to the latest offset for this subject
        // before deleting its compatibility config.
        kafkaStore.waitUntilKafkaReaderReachesLastOffset(subject, kafkaStoreTimeoutMs);
        deleteCompatibility(subject);
    } catch (StoreException e) {
        throw new SchemaRegistryStoreException("Failed to delete subject config value from store", e);
    }
}
Also used: SchemaRegistryStoreException (io.confluent.kafka.schemaregistry.exceptions.SchemaRegistryStoreException), OperationNotPermittedException (io.confluent.kafka.schemaregistry.exceptions.OperationNotPermittedException), StoreException (io.confluent.kafka.schemaregistry.storage.exceptions.StoreException)
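
A minimal caller sketch (not from the source; the schemaRegistry field and the SLF4J-style log variable are assumptions) showing how the two declared exceptions might be handled:

public void resetSubjectCompatibility(String subject) {
    try {
        schemaRegistry.deleteCompatibilityConfig(subject);
    } catch (OperationNotPermittedException e) {
        // Thrown when the subject is in read-only mode.
        log.warn("Compatibility config for {} was not deleted: {}", subject, e.getMessage());
    } catch (SchemaRegistryStoreException e) {
        // Wraps the StoreException raised by the Kafka-backed store.
        log.error("Failed to delete compatibility config for subject " + subject, e);
    }
}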

Example 17 with StoreException

Use of io.confluent.kafka.schemaregistry.storage.exceptions.StoreException in the schema-registry project by confluentinc.

From the class KafkaSchemaRegistry, the method allContexts:

private CloseableIterator<SchemaRegistryValue> allContexts() throws SchemaRegistryException {
    try {
        // Range scan over every context for this tenant: bound keys run from the smallest
        // to the largest possible context name.
        ContextKey key1 = new ContextKey(tenant(), String.valueOf(Character.MIN_VALUE));
        ContextKey key2 = new ContextKey(tenant(), String.valueOf(Character.MAX_VALUE));
        return kafkaStore.getAll(key1, key2);
    } catch (StoreException e) {
        throw new SchemaRegistryStoreException("Error from the backend Kafka store", e);
    }
}
Also used: SchemaRegistryStoreException (io.confluent.kafka.schemaregistry.exceptions.SchemaRegistryStoreException), StoreException (io.confluent.kafka.schemaregistry.storage.exceptions.StoreException)
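
A hedged sketch of consuming the returned iterator, assuming CloseableIterator combines Iterator and Closeable so it fits a try-with-resources block (the listContexts method name is hypothetical):

private List<SchemaRegistryValue> listContexts() throws SchemaRegistryException, IOException {
    List<SchemaRegistryValue> contexts = new ArrayList<>();
    // try-with-resources closes the underlying Kafka store iterator even if iteration fails.
    try (CloseableIterator<SchemaRegistryValue> iter = allContexts()) {
        while (iter.hasNext()) {
            contexts.add(iter.next());
        }
    }
    return contexts;
}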

Example 18 with StoreException

Use of io.confluent.kafka.schemaregistry.storage.exceptions.StoreException in the schema-registry project by confluentinc.

From the class KafkaSchemaRegistry, the method setMode:

public void setMode(String subject, Mode mode, boolean force) throws SchemaRegistryStoreException, OperationNotPermittedException {
    if (!allowModeChanges) {
        throw new OperationNotPermittedException("Mode changes are not allowed");
    }
    ModeKey modeKey = new ModeKey(subject);
    try {
        kafkaStore.waitUntilKafkaReaderReachesLastOffset(subject, kafkaStoreTimeoutMs);
        if (mode == Mode.IMPORT && getMode(subject) != Mode.IMPORT && !force) {
            // Changing to import mode requires that no schemas exist with matching subjects.
            if (hasSubjects(subject, false)) {
                throw new OperationNotPermittedException("Cannot import since found existing subjects");
            }
            // At this point no schemas should exist with matching subjects.
            // Write an event to clear deleted schemas from the caches.
            kafkaStore.put(new ClearSubjectKey(subject), new ClearSubjectValue(subject));
        }
        kafkaStore.put(modeKey, new ModeValue(subject, mode));
        log.debug("Wrote new mode: " + mode.name() + " to the" + " Kafka data store with key " + modeKey.toString());
    } catch (StoreException e) {
        throw new SchemaRegistryStoreException("Failed to write new mode to the store", e);
    }
}
Also used: SchemaRegistryStoreException (io.confluent.kafka.schemaregistry.exceptions.SchemaRegistryStoreException), OperationNotPermittedException (io.confluent.kafka.schemaregistry.exceptions.OperationNotPermittedException), StoreException (io.confluent.kafka.schemaregistry.storage.exceptions.StoreException)
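
A hedged caller sketch (the method and the schemaRegistry field are assumptions) that treats the permission failure as a soft outcome and lets store errors propagate:

public boolean tryEnableImportMode(String subject) throws SchemaRegistryStoreException {
    try {
        // force = false: refuse the switch if registered schemas already exist for the subject.
        schemaRegistry.setMode(subject, Mode.IMPORT, false);
        return true;
    } catch (OperationNotPermittedException e) {
        // Mode changes are disabled, or existing subjects block the change to IMPORT.
        return false;
    }
}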

Example 19 with StoreException

Use of io.confluent.kafka.schemaregistry.storage.exceptions.StoreException in the schema-registry project by confluentinc.

From the class KafkaStore, the method init:

@Override
public void init() throws StoreInitializationException {
    if (initialized.get()) {
        throw new StoreInitializationException("Illegal state while initializing store. Store was already initialized");
    }
    localStore.init();
    createOrVerifySchemaTopic();
    // set the producer properties and initialize a Kafka producer client
    Properties props = new Properties();
    addSchemaRegistryConfigsToClientProperties(this.config, props);
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapBrokers);
    props.put(ProducerConfig.ACKS_CONFIG, "-1");
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, org.apache.kafka.common.serialization.ByteArraySerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, org.apache.kafka.common.serialization.ByteArraySerializer.class);
    // Producer should not retry
    props.put(ProducerConfig.RETRIES_CONFIG, 0);
    props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, false);
    producer = new KafkaProducer<byte[], byte[]>(props);
    // start the background thread that subscribes to the Kafka topic and applies updates.
    // the thread must be created after the schema topic has been created.
    this.kafkaTopicReader = new KafkaStoreReaderThread<>(this.bootstrapBrokers, topic, groupId, this.storeUpdateHandler, serializer, this.localStore, this.producer, this.noopKey, this.initialized, this.config);
    this.kafkaTopicReader.start();
    try {
        waitUntilKafkaReaderReachesLastOffset(initTimeout);
    } catch (StoreException e) {
        throw new StoreInitializationException(e);
    }
    boolean isInitialized = initialized.compareAndSet(false, true);
    if (!isInitialized) {
        throw new StoreInitializationException("Illegal state while initializing store. Store " + "was already initialized");
    }
    this.storeUpdateHandler.cacheInitialized(new HashMap<>(kafkaTopicReader.checkpoints()));
    initLatch.countDown();
}
Also used: StoreInitializationException (io.confluent.kafka.schemaregistry.storage.exceptions.StoreInitializationException), Properties (java.util.Properties), StoreException (io.confluent.kafka.schemaregistry.storage.exceptions.StoreException)
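
A hedged sketch of bootstrapping the store at startup (the surrounding start method and the kafkaStore field are assumptions):

public void start() {
    try {
        // init() is single-shot: a second call fails with the "already initialized" error above.
        kafkaStore.init();
    } catch (StoreInitializationException e) {
        // Covers topic creation/verification problems and a reader that never reaches the last offset.
        throw new IllegalStateException("Could not bootstrap the Kafka-backed store", e);
    }
}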

Example 20 with StoreException

Use of io.confluent.kafka.schemaregistry.storage.exceptions.StoreException in the schema-registry project by confluentinc.

From the class KafkaStore, the method put:

@Override
public V put(K key, V value) throws StoreTimeoutException, StoreException {
    assertInitialized();
    if (key == null) {
        throw new StoreException("Key should not be null");
    }
    V oldValue = get(key);
    // write to the Kafka topic
    ProducerRecord<byte[], byte[]> producerRecord = null;
    try {
        producerRecord = new ProducerRecord<byte[], byte[]>(topic, 0, this.serializer.serializeKey(key), value == null ? null : this.serializer.serializeValue(value));
    } catch (SerializationException e) {
        throw new StoreException("Error serializing schema while creating the Kafka produce " + "record", e);
    }
    boolean knownSuccessfulWrite = false;
    try {
        log.trace("Sending record to KafkaStore topic: " + producerRecord);
        Future<RecordMetadata> ack = producer.send(producerRecord);
        RecordMetadata recordMetadata = ack.get(timeout, TimeUnit.MILLISECONDS);
        log.trace("Waiting for the local store to catch up to offset " + recordMetadata.offset());
        this.lastWrittenOffset = recordMetadata.offset();
        if (key instanceof SubjectKey) {
            setLastOffset(((SubjectKey) key).getSubject(), recordMetadata.offset());
        }
        waitUntilKafkaReaderReachesOffset(recordMetadata.offset(), timeout);
        knownSuccessfulWrite = true;
    } catch (InterruptedException e) {
        throw new StoreException("Put operation interrupted while waiting for an ack from Kafka", e);
    } catch (ExecutionException e) {
        throw new StoreException("Put operation failed while waiting for an ack from Kafka", e);
    } catch (TimeoutException e) {
        throw new StoreTimeoutException("Put operation timed out while waiting for an ack from Kafka", e);
    } catch (KafkaException ke) {
        throw new StoreException("Put operation to Kafka failed", ke);
    } finally {
        if (!knownSuccessfulWrite) {
            markLastWrittenOffsetInvalid();
        }
    }
    return oldValue;
}
Also used: SerializationException (io.confluent.kafka.schemaregistry.storage.exceptions.SerializationException), StoreException (io.confluent.kafka.schemaregistry.storage.exceptions.StoreException), RecordMetadata (org.apache.kafka.clients.producer.RecordMetadata), StoreTimeoutException (io.confluent.kafka.schemaregistry.storage.exceptions.StoreTimeoutException), KafkaException (org.apache.kafka.common.KafkaException), ExecutionException (java.util.concurrent.ExecutionException), TimeoutException (java.util.concurrent.TimeoutException)
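
A hedged caller sketch (the key and value types are assumptions based on the store's use in KafkaSchemaRegistry, and log is an SLF4J-style logger) that handles the timeout separately, since a timed-out put may still have reached the topic:

public void writeValue(SchemaRegistryKey key, SchemaRegistryValue value) throws StoreTimeoutException, StoreException {
    try {
        SchemaRegistryValue previous = kafkaStore.put(key, value);
        log.debug("Replaced previous value {} for key {}", previous, key);
    } catch (StoreTimeoutException e) {
        // No ack (or no local catch-up) within the timeout: the record may or may not have been written.
        log.warn("Put for key {} timed out; the write may still appear in the topic", key);
        throw e;
    }
    // Other StoreException causes (serialization, interrupted or failed ack, KafkaException) propagate as-is.
}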

Aggregations

StoreException (io.confluent.kafka.schemaregistry.storage.exceptions.StoreException): 22 usages
SchemaRegistryStoreException (io.confluent.kafka.schemaregistry.exceptions.SchemaRegistryStoreException): 16 usages
OperationNotPermittedException (io.confluent.kafka.schemaregistry.exceptions.OperationNotPermittedException): 7 usages
StoreTimeoutException (io.confluent.kafka.schemaregistry.storage.exceptions.StoreTimeoutException): 7 usages
ParsedSchema (io.confluent.kafka.schemaregistry.ParsedSchema): 5 usages
SchemaString (io.confluent.kafka.schemaregistry.client.rest.entities.SchemaString): 4 usages
SchemaRegistryTimeoutException (io.confluent.kafka.schemaregistry.exceptions.SchemaRegistryTimeoutException): 4 usages
ArrayList (java.util.ArrayList): 4 usages
AvroSchema (io.confluent.kafka.schemaregistry.avro.AvroSchema): 3 usages
Schema (io.confluent.kafka.schemaregistry.client.rest.entities.Schema): 3 usages
ReferenceExistsException (io.confluent.kafka.schemaregistry.exceptions.ReferenceExistsException): 3 usages
StoreInitializationException (io.confluent.kafka.schemaregistry.storage.exceptions.StoreInitializationException): 3 usages
ExecutionException (java.util.concurrent.ExecutionException): 3 usages
TimeoutException (java.util.concurrent.TimeoutException): 3 usages
SchemaProvider (io.confluent.kafka.schemaregistry.SchemaProvider): 2 usages
RestService (io.confluent.kafka.schemaregistry.client.rest.RestService): 2 usages
IdDoesNotMatchException (io.confluent.kafka.schemaregistry.exceptions.IdDoesNotMatchException): 2 usages
IncompatibleSchemaException (io.confluent.kafka.schemaregistry.exceptions.IncompatibleSchemaException): 2 usages
SchemaRegistryException (io.confluent.kafka.schemaregistry.exceptions.SchemaRegistryException): 2 usages
SchemaVersionNotSoftDeletedException (io.confluent.kafka.schemaregistry.exceptions.SchemaVersionNotSoftDeletedException): 2 usages