Search in sources:

Example 1 with OkapiConnectionParams

Use of org.folio.dataimport.util.OkapiConnectionParams in project mod-source-record-storage by folio-org.

Class DataImportKafkaHandler, method handle:

@Override
public Future<String> handle(KafkaConsumerRecord<String, String> targetRecord) {
    String recordId = extractValueFromHeaders(targetRecord.headers(), RECORD_ID_HEADER);
    String chunkId = extractValueFromHeaders(targetRecord.headers(), CHUNK_ID_HEADER);
    try {
        Promise<String> promise = Promise.promise();
        Event event = ObjectMapperTool.getMapper().readValue(targetRecord.value(), Event.class);
        DataImportEventPayload eventPayload = Json.decodeValue(event.getEventPayload(), DataImportEventPayload.class);
        LOGGER.debug("Data import event payload has been received with event type: '{}' by jobExecutionId: '{}' and recordId: '{}' and chunkId: '{}'", eventPayload.getEventType(), eventPayload.getJobExecutionId(), recordId, chunkId);
        eventPayload.getContext().put(RECORD_ID_HEADER, recordId);
        eventPayload.getContext().put(CHUNK_ID_HEADER, chunkId);
        OkapiConnectionParams params = RestUtil.retrieveOkapiConnectionParams(eventPayload, vertx);
        String jobProfileSnapshotId = eventPayload.getContext().get(PROFILE_SNAPSHOT_ID_KEY);
        profileSnapshotCache.get(jobProfileSnapshotId, params)
            .toCompletionStage()
            .thenCompose(snapshotOptional -> snapshotOptional
                .map(profileSnapshot -> EventManager.handleEvent(eventPayload, profileSnapshot))
                .orElse(CompletableFuture.failedFuture(new EventProcessingException(
                    format("Job profile snapshot with id '%s' does not exist", jobProfileSnapshotId)))))
            .whenComplete((processedPayload, throwable) -> {
            if (throwable != null) {
                promise.fail(throwable);
            } else if (DI_ERROR.value().equals(processedPayload.getEventType())) {
                promise.fail(format("Failed to process data import event payload from topic '%s' by jobExecutionId: '%s' with recordId: '%s' and chunkId: '%s' ", targetRecord.topic(), eventPayload.getJobExecutionId(), recordId, chunkId));
            } else {
                promise.complete(targetRecord.key());
            }
        });
        return promise.future();
    } catch (Exception e) {
        LOGGER.error("Failed to process data import kafka record from topic '{}' with recordId: '{}' and chunkId: '{}' ", targetRecord.topic(), recordId, chunkId, e);
        return Future.failedFuture(e);
    }
}
Also used: Event(org.folio.rest.jaxrs.model.Event) Json(io.vertx.core.json.Json) RestUtil(org.folio.services.util.RestUtil) Promise(io.vertx.core.Promise) Vertx(io.vertx.core.Vertx) DataImportEventPayload(org.folio.DataImportEventPayload) Autowired(org.springframework.beans.factory.annotation.Autowired) CompletableFuture(java.util.concurrent.CompletableFuture) AsyncRecordHandler(org.folio.kafka.AsyncRecordHandler) Future(io.vertx.core.Future) OkapiConnectionParams(org.folio.dataimport.util.OkapiConnectionParams) String.format(java.lang.String.format) DI_ERROR(org.folio.DataImportEventTypes.DI_ERROR) Component(org.springframework.stereotype.Component) List(java.util.List) Logger(org.apache.logging.log4j.Logger) EventProcessingException(org.folio.processing.exceptions.EventProcessingException) KafkaConsumerRecord(io.vertx.kafka.client.consumer.KafkaConsumerRecord) ObjectMapperTool(org.folio.dbschema.ObjectMapperTool) JobProfileSnapshotCache(org.folio.services.caches.JobProfileSnapshotCache) Qualifier(org.springframework.beans.factory.annotation.Qualifier) KafkaHeader(io.vertx.kafka.client.producer.KafkaHeader) LogManager(org.apache.logging.log4j.LogManager) EventManager(org.folio.processing.events.EventManager)
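
Both this handler and the one in Example 2 call an extractValueFromHeaders helper that the excerpt does not show. The sketch below is a plausible implementation, assuming only the Vert.x KafkaHeader API (String key(), Buffer value()); the actual helper in mod-source-record-storage may differ.

// Sketch of the unshown helper: return the value of the first header matching the given key,
// or null when the header is absent (the callers above tolerate missing ids).
private String extractValueFromHeaders(List<KafkaHeader> headers, String key) {
    return headers.stream()
        .filter(header -> header.key().equals(key))
        .findFirst()
        .map(header -> header.value().toString())
        .orElse(null);
}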

Example 2 with OkapiConnectionParams

Use of org.folio.dataimport.util.OkapiConnectionParams in project mod-source-record-storage by folio-org.

Class ParsedRecordChunksKafkaHandler, method handle:

@Override
public Future<String> handle(KafkaConsumerRecord<String, String> targetRecord) {
    Event event = Json.decodeValue(targetRecord.value(), Event.class);
    RecordCollection recordCollection = Json.decodeValue(event.getEventPayload(), RecordCollection.class);
    List<KafkaHeader> kafkaHeaders = targetRecord.headers();
    OkapiConnectionParams okapiConnectionParams = new OkapiConnectionParams(KafkaHeaderUtils.kafkaHeadersToMap(kafkaHeaders), vertx);
    String tenantId = okapiConnectionParams.getTenantId();
    String recordId = extractValueFromHeaders(targetRecord.headers(), RECORD_ID_HEADER);
    String chunkId = extractValueFromHeaders(targetRecord.headers(), CHUNK_ID_HEADER);
    String key = targetRecord.key();
    int chunkNumber = chunkCounter.incrementAndGet();
    DataImportEventPayload eventPayload = Json.decodeValue(event.getEventPayload(), DataImportEventPayload.class);
    try {
        LOGGER.debug("RecordCollection has been received with event: '{}', chunkId: '{}', starting processing... chunkNumber '{}'-'{}' with recordId: '{}'' ", eventPayload.getEventType(), chunkId, chunkNumber, key, recordId);
        return recordService.saveRecords(recordCollection, tenantId).compose(recordsBatchResponse -> sendBackRecordsBatchResponse(recordsBatchResponse, kafkaHeaders, tenantId, chunkNumber, eventPayload.getEventType(), targetRecord));
    } catch (Exception e) {
        LOGGER.error("RecordCollection processing has failed with errors with event: '{}', chunkId: '{}', chunkNumber '{}'-'{}' with recordId: '{}' ", eventPayload.getEventType(), chunkId, chunkNumber, key, recordId);
        return Future.failedFuture(e);
    }
}
Also used: RecordCollection(org.folio.rest.jaxrs.model.RecordCollection) Event(org.folio.rest.jaxrs.model.Event) OkapiConnectionParams(org.folio.dataimport.util.OkapiConnectionParams) KafkaHeader(io.vertx.kafka.client.producer.KafkaHeader) DataImportEventPayload(org.folio.DataImportEventPayload)
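
Taken together, the examples show the small surface of OkapiConnectionParams that Kafka handlers rely on: a (Map<String, String>, Vertx) constructor plus the getOkapiUrl(), getTenantId(), getToken() and getHeaders() accessors. The following is a minimal construction sketch, assuming the standard Okapi header names; the URL, tenant and token values are placeholders.

// Sketch only: build OkapiConnectionParams from a plain header map instead of Kafka headers.
Map<String, String> okapiHeaders = new HashMap<>();
okapiHeaders.put("x-okapi-url", "http://localhost:9130");   // assumed local Okapi URL
okapiHeaders.put("x-okapi-tenant", "diku");                 // assumed tenant id
okapiHeaders.put("x-okapi-token", "dummy-token");           // placeholder token
OkapiConnectionParams params = new OkapiConnectionParams(okapiHeaders, vertx);
String okapiUrl = params.getOkapiUrl();   // same accessors used in Examples 2, 4 and 5
String tenantId = params.getTenantId();
String token = params.getToken();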

Example 3 with OkapiConnectionParams

Use of org.folio.dataimport.util.OkapiConnectionParams in project mod-source-record-manager by folio-org.

Class EventDrivenChunkProcessingServiceImpl, method processRawRecordsChunk:

@Override
protected Future<Boolean> processRawRecordsChunk(RawRecordsDto incomingChunk, JobExecutionSourceChunk sourceChunk, String jobExecutionId, OkapiConnectionParams params) {
    LOGGER.debug("Starting to process raw records chunk with id: {} for jobExecutionId: {}. Chunk size: {}.", sourceChunk.getId(), jobExecutionId, sourceChunk.getChunkSize());
    Promise<Boolean> promise = Promise.promise();
    initializeJobExecutionProgressIfNecessary(jobExecutionId, incomingChunk, params.getTenantId())
        .compose(ar -> checkAndUpdateJobExecutionStatusIfNecessary(jobExecutionId,
            new StatusDto().withStatus(StatusDto.Status.PARSING_IN_PROGRESS), params))
        .compose(jobExec -> changeEngineService.parseRawRecordsChunkForJobExecution(incomingChunk, jobExec, sourceChunk.getId(), params))
        .onComplete(sendEventsAr -> updateJobExecutionIfAllSourceChunksMarkedAsError(jobExecutionId, params)
            .onComplete(updateAr -> promise.handle(sendEventsAr.map(true))));
    return promise.future();
}
Also used: StatusDto(org.folio.rest.jaxrs.model.StatusDto) Promise(io.vertx.core.Promise) Autowired(org.springframework.beans.factory.annotation.Autowired) JobExecutionSourceChunk(org.folio.rest.jaxrs.model.JobExecutionSourceChunk) Future(io.vertx.core.Future) OkapiConnectionParams(org.folio.dataimport.util.OkapiConnectionParams) NotFoundException(javax.ws.rs.NotFoundException) RawRecordsDto(org.folio.rest.jaxrs.model.RawRecordsDto) Logger(org.apache.logging.log4j.Logger) JobExecutionProgress(org.folio.rest.jaxrs.model.JobExecutionProgress) JobExecutionProgressService(org.folio.services.progress.JobExecutionProgressService) CollectionUtils.isNotEmpty(org.apache.commons.collections4.CollectionUtils.isNotEmpty) PARSING_IN_PROGRESS(org.folio.rest.jaxrs.model.StatusDto.Status.PARSING_IN_PROGRESS) Service(org.springframework.stereotype.Service) JobExecutionSourceChunkDao(org.folio.dao.JobExecutionSourceChunkDao) JobExecution(org.folio.rest.jaxrs.model.JobExecution) LogManager(org.apache.logging.log4j.LogManager)
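
The final step of the chain uses a Vert.x idiom worth calling out: sendEventsAr.map(true) keeps a failure as-is but replaces any successful value with true, and promise.handle(...) then completes or fails the promise from that AsyncResult. A small isolated sketch of the same idiom follows; someAsyncCall is a placeholder for any Future-returning operation.

// Sketch: turn the outcome of an arbitrary async call into a Future<Boolean>,
// preserving failures and mapping any success to true.
Promise<Boolean> promise = Promise.promise();
someAsyncCall()  // placeholder, e.g. a Future<String> from any Vert.x client
    .onComplete(ar -> promise.handle(ar.map(true)));
Future<Boolean> result = promise.future();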

Example 4 with OkapiConnectionParams

Use of org.folio.dataimport.util.OkapiConnectionParams in project mod-source-record-manager by folio-org.

Class JobExecutionServiceImpl, method updateSnapshotStatus:

private Future<JobExecution> updateSnapshotStatus(JobExecution jobExecution, OkapiConnectionParams params) {
    Promise<JobExecution> promise = Promise.promise();
    Snapshot snapshot = new Snapshot().withJobExecutionId(jobExecution.getId()).withStatus(Snapshot.Status.fromValue(jobExecution.getStatus().name()));
    SourceStorageSnapshotsClient client = new SourceStorageSnapshotsClient(params.getOkapiUrl(), params.getTenantId(), params.getToken());
    try {
        client.putSourceStorageSnapshotsByJobExecutionId(jobExecution.getId(), null, snapshot, response -> {
            if (response.result().statusCode() == HttpStatus.HTTP_OK.toInt()) {
                promise.complete(jobExecution);
            } else {
                jobExecutionDao.updateBlocking(jobExecution.getId(), jobExec -> {
                    Promise<JobExecution> jobExecutionPromise = Promise.promise();
                    jobExec.setErrorStatus(JobExecution.ErrorStatus.SNAPSHOT_UPDATE_ERROR);
                    jobExec.setStatus(JobExecution.Status.ERROR);
                    jobExec.setUiStatus(JobExecution.UiStatus.ERROR);
                    jobExec.setCompletedDate(new Date());
                    jobExecutionPromise.complete(jobExec);
                    return jobExecutionPromise.future();
                }, params.getTenantId()).onComplete(jobExecutionUpdate -> {
                    String message = "Couldn't update snapshot status for jobExecution with id " + jobExecution.getId();
                    LOGGER.error(message);
                    promise.fail(message);
                });
            }
        });
    } catch (Exception e) {
        LOGGER.error("Error during update for Snapshot with id {}", jobExecution.getId(), e);
        promise.fail(e);
    }
    return promise.future();
}
Also used: JobExecution(org.folio.rest.jaxrs.model.JobExecution) DataImportProfilesClient(org.folio.rest.client.DataImportProfilesClient) Arrays(java.util.Arrays) Date(java.util.Date) Autowired(org.springframework.beans.factory.annotation.Autowired) UserInfo(org.folio.rest.jaxrs.model.UserInfo) ProfileSnapshotWrapper(org.folio.rest.jaxrs.model.ProfileSnapshotWrapper) JobDuplicateUpdateException(org.folio.services.exceptions.JobDuplicateUpdateException) JobExecutionSourceChunkDao(org.folio.dao.JobExecutionSourceChunkDao) JobProfileInfo(org.folio.rest.jaxrs.model.JobProfileInfo) InitJobExecutionsRsDto(org.folio.rest.jaxrs.model.InitJobExecutionsRsDto) ERROR(org.folio.rest.jaxrs.model.StatusDto.Status.ERROR) JsonObject(io.vertx.core.json.JsonObject) BadRequestException(javax.ws.rs.BadRequestException) HttpException(io.vertx.ext.web.handler.HttpException) GenericCompositeFuture(org.folio.okapi.common.GenericCompositeFuture) StatusDto(org.folio.rest.jaxrs.model.StatusDto) UUID(java.util.UUID) Future(io.vertx.core.Future) Try(org.folio.dataimport.util.Try) OkapiConnectionParams(org.folio.dataimport.util.OkapiConnectionParams) NotFoundException(javax.ws.rs.NotFoundException) String.format(java.lang.String.format) Objects(java.util.Objects) List(java.util.List) HTTP_OK(org.folio.HttpStatus.HTTP_OK) Logger(org.apache.logging.log4j.Logger) JobExecutionFilter(org.folio.dao.JobExecutionFilter) JobExecutionDtoCollection(org.folio.rest.jaxrs.model.JobExecutionDtoCollection) HTTP_CREATED(org.folio.HttpStatus.HTTP_CREATED) Optional(java.util.Optional) RestUtil(org.folio.dataimport.util.RestUtil) JobProfile(org.folio.rest.jaxrs.model.JobProfile) FilenameUtils(org.apache.commons.io.FilenameUtils) Progress(org.folio.rest.jaxrs.model.Progress) JobExecutionDao(org.folio.dao.JobExecutionDao) CANCELLED(org.folio.rest.jaxrs.model.StatusDto.Status.CANCELLED) DeleteJobExecutionsResp(org.folio.rest.jaxrs.model.DeleteJobExecutionsResp) ArrayList(java.util.ArrayList) SourceStorageSnapshotsClient(org.folio.rest.client.SourceStorageSnapshotsClient) Service(org.springframework.stereotype.Service) InitJobExecutionsRqDto(org.folio.rest.jaxrs.model.InitJobExecutionsRqDto) Promise(io.vertx.core.Promise) COMMITTED(org.folio.rest.jaxrs.model.JobExecution.Status.COMMITTED) RunBy(org.folio.rest.jaxrs.model.RunBy) File(org.folio.rest.jaxrs.model.File) SortField(org.folio.dao.util.SortField) HttpMethod(io.vertx.core.http.HttpMethod) JobExecutionUserInfoCollection(org.folio.rest.jaxrs.model.JobExecutionUserInfoCollection) JobProfileInfoCollection(org.folio.rest.jaxrs.model.JobProfileInfoCollection) HttpStatus(org.folio.HttpStatus) PROFILE_SNAPSHOT_CREATING_ERROR(org.folio.rest.jaxrs.model.StatusDto.ErrorStatus.PROFILE_SNAPSHOT_CREATING_ERROR) LogManager(org.apache.logging.log4j.LogManager) Snapshot(org.folio.rest.jaxrs.model.Snapshot) Collections(java.util.Collections)
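
One detail worth flagging in this example: the callback dereferences response.result() without first checking whether the HTTP call itself failed, which would raise a NullPointerException on, say, a connection error. Below is a more defensive sketch, assuming the generated client hands the callback an AsyncResult (as the response.result() call implies); it is not the module's actual code.

// Defensive sketch: guard against a failed HTTP call before reading the status code.
client.putSourceStorageSnapshotsByJobExecutionId(jobExecution.getId(), null, snapshot, response -> {
    if (response.failed()) {
        promise.fail(response.cause());
        return;
    }
    if (response.result().statusCode() == HttpStatus.HTTP_OK.toInt()) {
        promise.complete(jobExecution);
    } else {
        // fall back to marking the JobExecution as errored, as in the original method above
    }
});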

Example 5 with OkapiConnectionParams

Use of org.folio.dataimport.util.OkapiConnectionParams in project mod-source-record-manager by folio-org.

Class DataImportKafkaHandler, method handle:

@Override
public Future<String> handle(KafkaConsumerRecord<String, String> record) {
    try {
        Promise<String> result = Promise.promise();
        List<KafkaHeader> kafkaHeaders = record.headers();
        OkapiConnectionParams okapiConnectionParams = new OkapiConnectionParams(KafkaHeaderUtils.kafkaHeadersToMap(kafkaHeaders), vertx);
        String recordId = okapiConnectionParams.getHeaders().get(RECORD_ID_HEADER);
        Event event = Json.decodeValue(record.value(), Event.class);
        String jobExecutionId = extractJobExecutionId(kafkaHeaders);
        LOGGER.info("Event was received with recordId: '{}' event type: '{}' with jobExecutionId: '{}'", recordId, event.getEventType(), jobExecutionId);
        if (StringUtils.isBlank(recordId)) {
            handleLocalEvent(result, okapiConnectionParams, event);
            return result.future();
        }
        eventProcessedService.collectData(DATA_IMPORT_KAFKA_HANDLER_UUID, event.getId(), okapiConnectionParams.getTenantId())
            .onSuccess(res -> handleLocalEvent(result, okapiConnectionParams, event))
            .onFailure(e -> {
            if (e instanceof DuplicateEventException) {
                LOGGER.info(e.getMessage());
                result.complete();
            } else {
                LOGGER.error("Error with database during collecting of deduplication info for handlerId: {} , eventId: {}. ", DATA_IMPORT_KAFKA_HANDLER_UUID, event.getId(), e);
                result.fail(e);
            }
        });
        return result.future();
    } catch (Exception e) {
        LOGGER.error("Error during processing data-import result", e);
        return Future.failedFuture(e);
    }
}
Also used: Event(org.folio.rest.jaxrs.model.Event) Json(io.vertx.core.json.Json) DuplicateEventException(org.folio.kafka.exception.DuplicateEventException) Promise(io.vertx.core.Promise) Vertx(io.vertx.core.Vertx) Autowired(org.springframework.beans.factory.annotation.Autowired) EventHandlingService(org.folio.services.EventHandlingService) AsyncRecordHandler(org.folio.kafka.AsyncRecordHandler) Future(io.vertx.core.Future) StringUtils(org.apache.commons.lang3.StringUtils) OkapiConnectionParams(org.folio.dataimport.util.OkapiConnectionParams) Component(org.springframework.stereotype.Component) List(java.util.List) Logger(org.apache.logging.log4j.Logger) KafkaConsumerRecord(io.vertx.kafka.client.consumer.KafkaConsumerRecord) Qualifier(org.springframework.beans.factory.annotation.Qualifier) EventProcessedService(org.folio.services.EventProcessedService) KafkaHeader(io.vertx.kafka.client.producer.KafkaHeader) LogManager(org.apache.logging.log4j.LogManager) KafkaHeaderUtils(org.folio.kafka.KafkaHeaderUtils) RawRecordsFlowControlService(org.folio.services.flowcontrol.RawRecordsFlowControlService)
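
The deduplication branch is the key part of this handler: collectData is expected to fail with DuplicateEventException when the same event id has already been recorded for the tenant, and the handler deliberately completes the future in that case so the Kafka record is acknowledged instead of being reprocessed. A condensed sketch of that guard follows, reusing only the types shown above; process is a placeholder for the real follow-up work.

// Sketch: treat "already processed" as success so the consumer commits the offset,
// and fail only on genuine errors; this mirrors the branching in the handler above.
eventProcessedService.collectData(DATA_IMPORT_KAFKA_HANDLER_UUID, event.getId(), okapiConnectionParams.getTenantId())
    .onSuccess(res -> process(event, okapiConnectionParams))   // placeholder follow-up
    .onFailure(e -> {
        if (e instanceof DuplicateEventException) {
            result.complete();   // duplicate delivery: acknowledge and move on
        } else {
            result.fail(e);      // real failure: propagate so it can be retried or inspected
        }
    });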

Aggregations

OkapiConnectionParams (org.folio.dataimport.util.OkapiConnectionParams): 37
Future (io.vertx.core.Future): 17
LogManager (org.apache.logging.log4j.LogManager): 15
Logger (org.apache.logging.log4j.Logger): 15
Autowired (org.springframework.beans.factory.annotation.Autowired): 15
KafkaHeader (io.vertx.kafka.client.producer.KafkaHeader): 14
HashMap (java.util.HashMap): 14
NotFoundException (javax.ws.rs.NotFoundException): 14
Promise (io.vertx.core.Promise): 13
JsonObject (io.vertx.core.json.JsonObject): 13
String.format (java.lang.String.format): 12
List (java.util.List): 12
JobExecution (org.folio.rest.jaxrs.model.JobExecution): 11
Json (io.vertx.core.json.Json): 10
Event (org.folio.rest.jaxrs.model.Event): 10
StatusDto (org.folio.rest.jaxrs.model.StatusDto): 10
Service (org.springframework.stereotype.Service): 10
Vertx (io.vertx.core.Vertx): 9
KafkaHeaderUtils (org.folio.kafka.KafkaHeaderUtils): 9
RawRecordsDto (org.folio.rest.jaxrs.model.RawRecordsDto): 9