Example 11 with InputConfig

use of com.google.cloud.automl.v1beta1.InputConfig in project java-automl by googleapis.

the class ImportDataset method importDataset.

// Import a dataset
static void importDataset(String projectId, String datasetId, String path) throws IOException, ExecutionException, InterruptedException, TimeoutException {
    Duration totalTimeout = Duration.ofMinutes(45);
    RetrySettings retrySettings = RetrySettings.newBuilder().setTotalTimeout(totalTimeout).build();
    AutoMlSettings.Builder builder = AutoMlSettings.newBuilder();
    // Apply the retry settings to the importData call (the sub-builder is mutated in place)
    builder.importDataSettings().setRetrySettings(retrySettings);
    AutoMlSettings settings = builder.build();
    // Initialize the client. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (AutoMlClient client = AutoMlClient.create(settings)) {
        // Get the complete path of the dataset.
        DatasetName datasetFullId = DatasetName.of(projectId, "us-central1", datasetId);
        // Get multiple Google Cloud Storage URIs to import data from
        GcsSource gcsSource = GcsSource.newBuilder().addAllInputUris(Arrays.asList(path.split(","))).build();
        // Import data from the input URI
        InputConfig inputConfig = InputConfig.newBuilder().setGcsSource(gcsSource).build();
        System.out.println("Processing import...");
        // Start the import job
        OperationFuture<Empty, OperationMetadata> operation = client.importDataAsync(datasetFullId, inputConfig);
        System.out.format("Operation name: %s%n", operation.getName());
        // If you want to wait for the operation to finish, adjust the timeout appropriately. The
        // operation will still run if you choose not to wait for it to complete. You can check the
        // status of your operation using the operation's name.
        Empty response = operation.get(45, TimeUnit.MINUTES);
        System.out.format("Dataset imported. %s%n", response);
    } catch (TimeoutException e) {
        System.out.println("The operation's polling period was not long enough.");
        System.out.println("You can use the Operation's name to get the current status.");
        System.out.println("The import job is still running and will complete as expected.");
        throw e;
    }
}
Also used : RetrySettings(com.google.api.gax.retrying.RetrySettings) Empty(com.google.protobuf.Empty) GcsSource(com.google.cloud.automl.v1beta1.GcsSource) DatasetName(com.google.cloud.automl.v1beta1.DatasetName) Duration(org.threeten.bp.Duration) InputConfig(com.google.cloud.automl.v1beta1.InputConfig) AutoMlSettings(com.google.cloud.automl.v1beta1.AutoMlSettings) OperationMetadata(com.google.cloud.automl.v1beta1.OperationMetadata) AutoMlClient(com.google.cloud.automl.v1beta1.AutoMlClient) TimeoutException(java.util.concurrent.TimeoutException)
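Example 11's `path` parameter packs several Cloud Storage URIs into a single comma-separated string, which the sample turns into a list with `Arrays.asList(path.split(","))`. A minimal, dependency-free sketch of that parsing step (the class and method names here are illustrative, not part of the AutoML API):

```java
import java.util.Arrays;
import java.util.List;

public class UriSplitExample {
    // Split a comma-separated string of GCS URIs into a list,
    // mirroring Arrays.asList(path.split(",")) in the sample above.
    static List<String> splitUris(String path) {
        return Arrays.asList(path.split(","));
    }

    public static void main(String[] args) {
        List<String> uris = splitUris("gs://bucket/a.csv,gs://bucket/b.csv");
        System.out.println(uris.size()); // 2
        System.out.println(uris.get(0)); // gs://bucket/a.csv
    }
}
```

Note that `split(",")` does no trimming, so stray whitespace around the commas would end up inside the URIs.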

Example 12 with InputConfig

use of com.google.cloud.automl.v1.InputConfig in project java-automl by googleapis.

the class ImportDataset method importDataset.

// Import a dataset
static void importDataset(String projectId, String datasetId, String path) throws IOException, ExecutionException, InterruptedException, TimeoutException {
    // Initialize the client. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (AutoMlClient client = AutoMlClient.create()) {
        // Get the complete path of the dataset.
        DatasetName datasetFullId = DatasetName.of(projectId, "us-central1", datasetId);
        // Get multiple Google Cloud Storage URIs to import data from
        GcsSource gcsSource = GcsSource.newBuilder().addAllInputUris(Arrays.asList(path.split(","))).build();
        // Import data from the input URI
        InputConfig inputConfig = InputConfig.newBuilder().setGcsSource(gcsSource).build();
        System.out.println("Processing import...");
        // Start the import job
        OperationFuture<Empty, OperationMetadata> operation = client.importDataAsync(datasetFullId, inputConfig);
        System.out.format("Operation name: %s%n", operation.getName());
        // If you want to wait for the operation to finish, adjust the timeout appropriately. The
        // operation will still run if you choose not to wait for it to complete. You can check the
        // status of your operation using the operation's name.
        Empty response = operation.get(45, TimeUnit.MINUTES);
        System.out.format("Dataset imported. %s%n", response);
    } catch (TimeoutException e) {
        System.out.println("The operation's polling period was not long enough.");
        System.out.println("You can use the Operation's name to get the current status.");
        System.out.println("The import job is still running and will complete as expected.");
        throw e;
    }
}
Also used : Empty(com.google.protobuf.Empty) GcsSource(com.google.cloud.automl.v1.GcsSource) DatasetName(com.google.cloud.automl.v1.DatasetName) InputConfig(com.google.cloud.automl.v1.InputConfig) OperationMetadata(com.google.cloud.automl.v1.OperationMetadata) AutoMlClient(com.google.cloud.automl.v1.AutoMlClient) TimeoutException(java.util.concurrent.TimeoutException)

Example 13 with InputConfig

use of com.google.cloud.vision.v1.InputConfig in project spring-cloud-gcp by GoogleCloudPlatform.

the class CloudVisionTemplate method analyzeFile.

/**
 * Analyze a file and extract the features of the image specified by {@code featureTypes}.
 *
 * <p>A feature describes the kind of Cloud Vision analysis one wishes to perform on a file, such
 * as text detection, image labelling, facial detection, etc. A full list of feature types can be
 * found in {@link Feature.Type}.
 *
 * @param fileResource the file one wishes to analyze. The Cloud Vision APIs support image formats
 *     described here: https://cloud.google.com/vision/docs/supported-files. Documents with more
 *     than 5 pages are not supported.
 * @param mimeType the mime type of the fileResource. Currently, only "application/pdf",
 *     "image/tiff" and "image/gif" are supported.
 * @param featureTypes the types of image analysis to perform on the image
 * @return the results of the file analysis
 * @throws CloudVisionException if the file could not be read or if a malformed response is
 *     received from the Cloud Vision APIs
 */
public AnnotateFileResponse analyzeFile(Resource fileResource, String mimeType, Feature.Type... featureTypes) {
    ByteString imgBytes;
    try {
        imgBytes = ByteString.readFrom(fileResource.getInputStream());
    } catch (IOException ex) {
        throw new CloudVisionException(READ_BYTES_ERROR_MESSAGE, ex);
    }
    InputConfig inputConfig = InputConfig.newBuilder().setMimeType(mimeType).setContent(imgBytes).build();
    List<Feature> featureList = Arrays.stream(featureTypes).map(featureType -> Feature.newBuilder().setType(featureType).build()).collect(Collectors.toList());
    BatchAnnotateFilesRequest request = BatchAnnotateFilesRequest.newBuilder().addRequests(AnnotateFileRequest.newBuilder().addAllFeatures(featureList).setInputConfig(inputConfig).build()).build();
    BatchAnnotateFilesResponse response = this.imageAnnotatorClient.batchAnnotateFiles(request);
    List<AnnotateFileResponse> annotateFileResponses = response.getResponsesList();
    if (!annotateFileResponses.isEmpty()) {
        return annotateFileResponses.get(0);
    } else {
        throw new CloudVisionException(EMPTY_RESPONSE_ERROR_MESSAGE);
    }
}
Also used : AnnotateFileRequest(com.google.cloud.vision.v1.AnnotateFileRequest) Arrays(java.util.Arrays) AnnotateImageResponse(com.google.cloud.vision.v1.AnnotateImageResponse) Type(com.google.cloud.vision.v1.Feature.Type) BatchAnnotateFilesRequest(com.google.cloud.vision.v1.BatchAnnotateFilesRequest) IOException(java.io.IOException) InputConfig(com.google.cloud.vision.v1.InputConfig) Collectors(java.util.stream.Collectors) Feature(com.google.cloud.vision.v1.Feature) ByteString(com.google.protobuf.ByteString) List(java.util.List) AnnotateFileResponse(com.google.cloud.vision.v1.AnnotateFileResponse) BatchAnnotateFilesResponse(com.google.cloud.vision.v1.BatchAnnotateFilesResponse) Image(com.google.cloud.vision.v1.Image) ImageAnnotatorClient(com.google.cloud.vision.v1.ImageAnnotatorClient) ImageContext(com.google.cloud.vision.v1.ImageContext) AnnotateImageRequest(com.google.cloud.vision.v1.AnnotateImageRequest) BatchAnnotateImagesResponse(com.google.cloud.vision.v1.BatchAnnotateImagesResponse) BatchAnnotateImagesRequest(com.google.cloud.vision.v1.BatchAnnotateImagesRequest) Code(com.google.rpc.Code) Resource(org.springframework.core.io.Resource) Assert(org.springframework.util.Assert)
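Example 13 maps the `Feature.Type...` varargs into a list of `Feature` messages with a stream before attaching them to the request. A dependency-free sketch of that varargs-to-list pattern, using a stand-in enum instead of the real Vision API types (all names below are illustrative):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class FeatureMappingExample {
    // Stand-in for com.google.cloud.vision.v1.Feature.Type
    enum Type { TEXT_DETECTION, LABEL_DETECTION, FACE_DETECTION }

    // Wrap each requested type, mirroring
    // Arrays.stream(featureTypes).map(t -> Feature.newBuilder().setType(t).build())
    static List<String> toFeatureList(Type... types) {
        return Arrays.stream(types)
                .map(t -> "Feature[" + t.name() + "]")
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // Prints the wrapped form of each requested analysis type
        System.out.println(toFeatureList(Type.TEXT_DETECTION, Type.LABEL_DETECTION));
    }
}
```

The same shape works for any builder-based proto message: stream the varargs, map each element through the builder, and collect.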

Example 14 with InputConfig

use of com.google.cloud.automl.v1.InputConfig in project java-automl by googleapis.

the class DatasetApi method importData.

// [START automl_translate_import_data]
/**
 * Import sentence pairs to the dataset.
 *
 * @param projectId the Google Cloud Project ID.
 * @param computeRegion the Region name. (e.g., "us-central1").
 * @param datasetId the Id of the dataset.
 * @param path the remote Path of the training data csv file.
 */
public static void importData(String projectId, String computeRegion, String datasetId, String path) throws IOException, InterruptedException, ExecutionException {
    // Instantiates a client
    try (AutoMlClient client = AutoMlClient.create()) {
        // Get the complete path of the dataset.
        DatasetName datasetFullId = DatasetName.of(projectId, computeRegion, datasetId);
        GcsSource.Builder gcsSource = GcsSource.newBuilder();
        // Get multiple Google Cloud Storage URIs to import data from
        String[] inputUris = path.split(",");
        for (String inputUri : inputUris) {
            gcsSource.addInputUris(inputUri);
        }
        // Import data from the input URI
        InputConfig inputConfig = InputConfig.newBuilder().setGcsSource(gcsSource).build();
        System.out.println("Processing import...");
        Empty response = client.importDataAsync(datasetFullId, inputConfig).get();
        System.out.println(String.format("Dataset imported. %s", response));
    }
}
Also used : Empty(com.google.protobuf.Empty) GcsSource(com.google.cloud.automl.v1.GcsSource) DatasetName(com.google.cloud.automl.v1.DatasetName) InputConfig(com.google.cloud.automl.v1.InputConfig) AutoMlClient(com.google.cloud.automl.v1.AutoMlClient)

Example 15 with InputConfig

use of com.google.cloud.translate.v3.InputConfig in project java-translate by googleapis.

the class BatchTranslateTextWithModel method batchTranslateTextWithModel.

// Batch translate text using AutoML Translation model
public static void batchTranslateTextWithModel(String projectId, String sourceLanguage, String targetLanguage, String inputUri, String outputUri, String modelId) throws IOException, ExecutionException, InterruptedException, TimeoutException {
    // Initialize the client. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (TranslationServiceClient client = TranslationServiceClient.create()) {
        // Supported Locations: `global`, [glossary location], or [model location]
        // Glossaries must be hosted in `us-central1`
        // Custom Models must use the same location as your model. (us-central1)
        String location = "us-central1";
        LocationName parent = LocationName.of(projectId, location);
        // Configure the source of the file from a GCS bucket
        GcsSource gcsSource = GcsSource.newBuilder().setInputUri(inputUri).build();
        // Supported Mime Types: https://cloud.google.com/translate/docs/supported-formats
        InputConfig inputConfig = InputConfig.newBuilder().setGcsSource(gcsSource).setMimeType("text/plain").build();
        // Configure where to store the output in a GCS bucket
        GcsDestination gcsDestination = GcsDestination.newBuilder().setOutputUriPrefix(outputUri).build();
        OutputConfig outputConfig = OutputConfig.newBuilder().setGcsDestination(gcsDestination).build();
        // Configure the model used in the request
        String modelPath = String.format("projects/%s/locations/%s/models/%s", projectId, location, modelId);
        // Build the request that will be sent to the API
        BatchTranslateTextRequest request = BatchTranslateTextRequest.newBuilder().setParent(parent.toString()).setSourceLanguageCode(sourceLanguage).addTargetLanguageCodes(targetLanguage).addInputConfigs(inputConfig).setOutputConfig(outputConfig).putModels(targetLanguage, modelPath).build();
        // Start an asynchronous request
        OperationFuture<BatchTranslateResponse, BatchTranslateMetadata> future = client.batchTranslateTextAsync(request);
        System.out.println("Waiting for operation to complete...");
        // Wait a random number of seconds between 450 (inclusive) and 600 (exclusive)
        long randomNumber = ThreadLocalRandom.current().nextInt(450, 600);
        BatchTranslateResponse response = future.get(randomNumber, TimeUnit.SECONDS);
        // Display the translation for each input text provided
        System.out.printf("Total Characters: %s\n", response.getTotalCharacters());
        System.out.printf("Translated Characters: %s\n", response.getTranslatedCharacters());
    }
}
Also used : BatchTranslateMetadata(com.google.cloud.translate.v3.BatchTranslateMetadata) TranslationServiceClient(com.google.cloud.translate.v3.TranslationServiceClient) GcsSource(com.google.cloud.translate.v3.GcsSource) OutputConfig(com.google.cloud.translate.v3.OutputConfig) BatchTranslateTextRequest(com.google.cloud.translate.v3.BatchTranslateTextRequest) InputConfig(com.google.cloud.translate.v3.InputConfig) GcsDestination(com.google.cloud.translate.v3.GcsDestination) BatchTranslateResponse(com.google.cloud.translate.v3.BatchTranslateResponse) LocationName(com.google.cloud.translate.v3.LocationName)
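Example 15 randomizes its polling timeout with `ThreadLocalRandom.current().nextInt(450, 600)`. It is worth remembering that `nextInt(origin, bound)` is half-open: the result is always at least `origin` and strictly less than `bound`. A small standalone sketch (the class and method names are illustrative):

```java
import java.util.concurrent.ThreadLocalRandom;

public class TimeoutPickExample {
    // Pick a wait time in seconds; nextInt(origin, bound) returns a value
    // in [origin, bound), so this yields 450..599 inclusive.
    static long pickTimeoutSeconds() {
        return ThreadLocalRandom.current().nextInt(450, 600);
    }

    public static void main(String[] args) {
        long t = pickTimeoutSeconds();
        System.out.println(t >= 450 && t < 600); // true
    }
}
```

Randomizing the wait like this can spread out polling load when many batch jobs are launched together; a fixed timeout works just as well for a single job.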

Aggregations

Document (com.google.cloud.documentai.v1beta2.Document) 7 uses
DocumentUnderstandingServiceClient (com.google.cloud.documentai.v1beta2.DocumentUnderstandingServiceClient) 7 uses
GcsSource (com.google.cloud.documentai.v1beta2.GcsSource) 7 uses
InputConfig (com.google.cloud.documentai.v1beta2.InputConfig) 7 uses
ProcessDocumentRequest (com.google.cloud.documentai.v1beta2.ProcessDocumentRequest) 7 uses
InputConfig (com.google.cloud.vision.v1.InputConfig) 6 uses
BatchTranslateMetadata (com.google.cloud.translate.v3.BatchTranslateMetadata) 4 uses
BatchTranslateResponse (com.google.cloud.translate.v3.BatchTranslateResponse) 4 uses
BatchTranslateTextRequest (com.google.cloud.translate.v3.BatchTranslateTextRequest) 4 uses
GcsDestination (com.google.cloud.translate.v3.GcsDestination) 4 uses
GcsSource (com.google.cloud.translate.v3.GcsSource) 4 uses
InputConfig (com.google.cloud.translate.v3.InputConfig) 4 uses
LocationName (com.google.cloud.translate.v3.LocationName) 4 uses
OutputConfig (com.google.cloud.translate.v3.OutputConfig) 4 uses
TranslationServiceClient (com.google.cloud.translate.v3.TranslationServiceClient) 4 uses
ByteString (com.google.protobuf.ByteString) 4 uses
AnnotateImageResponse (com.google.cloud.vision.v1.AnnotateImageResponse) 3 uses
Feature (com.google.cloud.vision.v1.Feature) 3 uses
GcsSource (com.google.cloud.vision.v1.GcsSource) 3 uses
ImageAnnotatorClient (com.google.cloud.vision.v1.ImageAnnotatorClient) 3 uses