
Example 91 with Endpoint

use of com.google.cloud.aiplatform.v1.Endpoint in project java-aiplatform by googleapis.

the class DeployModelCustomTrainedModelSample method deployModelCustomTrainedModelSample.

static void deployModelCustomTrainedModelSample(String project, String endpointId, String model, String deployedModelDisplayName) throws IOException, ExecutionException, InterruptedException {
    EndpointServiceSettings settings = EndpointServiceSettings.newBuilder().setEndpoint("us-central1-aiplatform.googleapis.com:443").build();
    String location = "us-central1";
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (EndpointServiceClient client = EndpointServiceClient.create(settings)) {
        MachineSpec machineSpec = MachineSpec.newBuilder().setMachineType("n1-standard-2").build();
        DedicatedResources dedicatedResources = DedicatedResources.newBuilder().setMinReplicaCount(1).setMachineSpec(machineSpec).build();
        String modelName = ModelName.of(project, location, model).toString();
        DeployedModel deployedModel = DeployedModel.newBuilder().setModel(modelName).setDisplayName(deployedModelDisplayName).setDedicatedResources(dedicatedResources).build();
        // key '0' assigns traffic for the newly deployed model
        // Traffic percentage values must add up to 100
        // Leave dictionary empty if endpoint should not accept any traffic
        Map<String, Integer> trafficSplit = new HashMap<>();
        trafficSplit.put("0", 100);
        EndpointName endpoint = EndpointName.of(project, location, endpointId);
        OperationFuture<DeployModelResponse, DeployModelOperationMetadata> response = client.deployModelAsync(endpoint, deployedModel, trafficSplit);
        // You can use OperationFuture.getInitialFuture to get a future representing the initial
        // response to the request, which contains information while the operation is in progress.
        System.out.format("Operation name: %s\n", response.getInitialFuture().get().getName());
        // OperationFuture.get() will block until the operation is finished.
        DeployModelResponse deployModelResponse = response.get();
        System.out.format("deployModelResponse: %s\n", deployModelResponse);
    }
}
Also used : HashMap(java.util.HashMap) DedicatedResources(com.google.cloud.aiplatform.v1.DedicatedResources) MachineSpec(com.google.cloud.aiplatform.v1.MachineSpec) DeployedModel(com.google.cloud.aiplatform.v1.DeployedModel) DeployModelResponse(com.google.cloud.aiplatform.v1.DeployModelResponse) DeployModelOperationMetadata(com.google.cloud.aiplatform.v1.DeployModelOperationMetadata) EndpointName(com.google.cloud.aiplatform.v1.EndpointName) EndpointServiceClient(com.google.cloud.aiplatform.v1.EndpointServiceClient) EndpointServiceSettings(com.google.cloud.aiplatform.v1.EndpointServiceSettings)
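
For the reverse operation, undeploying the model from the endpoint, a minimal sketch follows (an illustration, not one of the repository's samples reproduced here). It assumes deployedModelId is the ID returned by the deploy operation, and it additionally needs UndeployModelResponse and UndeployModelOperationMetadata from com.google.cloud.aiplatform.v1 plus OperationFuture from com.google.api.gax.longrunning.

static void undeployModelSample(String project, String endpointId, String deployedModelId) throws IOException, ExecutionException, InterruptedException {
    EndpointServiceSettings settings = EndpointServiceSettings.newBuilder().setEndpoint("us-central1-aiplatform.googleapis.com:443").build();
    try (EndpointServiceClient client = EndpointServiceClient.create(settings)) {
        EndpointName endpoint = EndpointName.of(project, "us-central1", endpointId);
        // If this is the only deployed model, an empty traffic split is sufficient; otherwise the
        // remaining deployed models' shares must be listed here and sum to 100.
        Map<String, Integer> trafficSplit = new HashMap<>();
        OperationFuture<UndeployModelResponse, UndeployModelOperationMetadata> response = client.undeployModelAsync(endpoint, deployedModelId, trafficSplit);
        // get() blocks until the undeploy operation has finished.
        System.out.format("undeployModelResponse: %s\n", response.get());
    }
}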

Example 92 with Endpoint

use of com.google.cloud.aiplatform.v1.Endpoint in project java-aiplatform by googleapis.

the class DeployModelSample method deployModelSample.

static void deployModelSample(String project, String deployedModelDisplayName, String endpointId, String modelId) throws IOException, InterruptedException, ExecutionException, TimeoutException {
    EndpointServiceSettings endpointServiceSettings = EndpointServiceSettings.newBuilder().setEndpoint("us-central1-aiplatform.googleapis.com:443").build();
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (EndpointServiceClient endpointServiceClient = EndpointServiceClient.create(endpointServiceSettings)) {
        String location = "us-central1";
        EndpointName endpointName = EndpointName.of(project, location, endpointId);
        // key '0' assigns traffic for the newly deployed model
        // Traffic percentage values must add up to 100
        // Leave dictionary empty if endpoint should not accept any traffic
        Map<String, Integer> trafficSplit = new HashMap<>();
        trafficSplit.put("0", 100);
        ModelName modelName = ModelName.of(project, location, modelId);
        AutomaticResources automaticResourcesInput = AutomaticResources.newBuilder().setMinReplicaCount(1).setMaxReplicaCount(1).build();
        DeployedModel deployedModelInput = DeployedModel.newBuilder().setModel(modelName.toString()).setDisplayName(deployedModelDisplayName).setAutomaticResources(automaticResourcesInput).build();
        OperationFuture<DeployModelResponse, DeployModelOperationMetadata> deployModelResponseFuture = endpointServiceClient.deployModelAsync(endpointName, deployedModelInput, trafficSplit);
        System.out.format("Operation name: %s\n", deployModelResponseFuture.getInitialFuture().get().getName());
        System.out.println("Waiting for operation to finish...");
        DeployModelResponse deployModelResponse = deployModelResponseFuture.get(20, TimeUnit.MINUTES);
        System.out.println("Deploy Model Response");
        DeployedModel deployedModel = deployModelResponse.getDeployedModel();
        System.out.println("\tDeployed Model");
        System.out.format("\t\tid: %s\n", deployedModel.getId());
        System.out.format("\t\tmodel: %s\n", deployedModel.getModel());
        System.out.format("\t\tDisplay Name: %s\n", deployedModel.getDisplayName());
        System.out.format("\t\tCreate Time: %s\n", deployedModel.getCreateTime());
        DedicatedResources dedicatedResources = deployedModel.getDedicatedResources();
        System.out.println("\t\tDedicated Resources");
        System.out.format("\t\t\tMin Replica Count: %s\n", dedicatedResources.getMinReplicaCount());
        MachineSpec machineSpec = dedicatedResources.getMachineSpec();
        System.out.println("\t\t\tMachine Spec");
        System.out.format("\t\t\t\tMachine Type: %s\n", machineSpec.getMachineType());
        System.out.format("\t\t\t\tAccelerator Type: %s\n", machineSpec.getAcceleratorType());
        System.out.format("\t\t\t\tAccelerator Count: %s\n", machineSpec.getAcceleratorCount());
        AutomaticResources automaticResources = deployedModel.getAutomaticResources();
        System.out.println("\t\tAutomatic Resources");
        System.out.format("\t\t\tMin Replica Count: %s\n", automaticResources.getMinReplicaCount());
        System.out.format("\t\t\tMax Replica Count: %s\n", automaticResources.getMaxReplicaCount());
    }
}
Also used : ModelName(com.google.cloud.aiplatform.v1.ModelName) HashMap(java.util.HashMap) DedicatedResources(com.google.cloud.aiplatform.v1.DedicatedResources) MachineSpec(com.google.cloud.aiplatform.v1.MachineSpec) DeployedModel(com.google.cloud.aiplatform.v1.DeployedModel) DeployModelResponse(com.google.cloud.aiplatform.v1.DeployModelResponse) DeployModelOperationMetadata(com.google.cloud.aiplatform.v1.DeployModelOperationMetadata) EndpointName(com.google.cloud.aiplatform.v1.EndpointName) AutomaticResources(com.google.cloud.aiplatform.v1.AutomaticResources) EndpointServiceClient(com.google.cloud.aiplatform.v1.EndpointServiceClient) EndpointServiceSettings(com.google.cloud.aiplatform.v1.EndpointServiceSettings)
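
Once a model is deployed and receiving traffic, the endpoint can be queried through PredictionServiceClient. The sketch below is an illustration rather than one of the repository's samples: the instance JSON and the (here empty) parameters Value are model-specific placeholders. It additionally needs PredictionServiceClient, PredictionServiceSettings, and PredictResponse from com.google.cloud.aiplatform.v1, Value from com.google.protobuf, JsonFormat from com.google.protobuf.util, and the usual java.util collections.

static void predictSample(String project, String endpointId, String instanceJson) throws IOException {
    PredictionServiceSettings settings = PredictionServiceSettings.newBuilder().setEndpoint("us-central1-aiplatform.googleapis.com:443").build();
    try (PredictionServiceClient client = PredictionServiceClient.create(settings)) {
        EndpointName endpointName = EndpointName.of(project, "us-central1", endpointId);
        // Parse one JSON instance into a protobuf Value; the expected schema depends on the model.
        Value.Builder instanceBuilder = Value.newBuilder();
        JsonFormat.parser().merge(instanceJson, instanceBuilder);
        List<Value> instances = new ArrayList<>();
        instances.add(instanceBuilder.build());
        // An empty Value stands in for models that take no extra prediction parameters.
        PredictResponse response = client.predict(endpointName, instances, Value.newBuilder().build());
        System.out.format("Predictions: %s\n", response.getPredictionsList());
    }
}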

Example 93 with Endpoint

use of com.google.cloud.aiplatform.v1.Endpoint in project java-aiplatform by googleapis.

the class EndpointServiceClientTest method listEndpointsTest2.

@Test
public void listEndpointsTest2() throws Exception {
    Endpoint responsesElement = Endpoint.newBuilder().build();
    ListEndpointsResponse expectedResponse = ListEndpointsResponse.newBuilder().setNextPageToken("").addAllEndpoints(Arrays.asList(responsesElement)).build();
    mockEndpointService.addResponse(expectedResponse);
    String parent = "parent-995424086";
    ListEndpointsPagedResponse pagedListResponse = client.listEndpoints(parent);
    List<Endpoint> resources = Lists.newArrayList(pagedListResponse.iterateAll());
    Assert.assertEquals(1, resources.size());
    Assert.assertEquals(expectedResponse.getEndpointsList().get(0), resources.get(0));
    List<AbstractMessage> actualRequests = mockEndpointService.getRequests();
    Assert.assertEquals(1, actualRequests.size());
    ListEndpointsRequest actualRequest = ((ListEndpointsRequest) actualRequests.get(0));
    Assert.assertEquals(parent, actualRequest.getParent());
    Assert.assertTrue(channelProvider.isHeaderSent(ApiClientHeaderProvider.getDefaultApiClientHeaderKey(), GaxGrpcProperties.getDefaultApiClientHeaderPattern()));
}
Also used : AbstractMessage(com.google.protobuf.AbstractMessage) ListEndpointsPagedResponse(com.google.cloud.aiplatform.v1.EndpointServiceClient.ListEndpointsPagedResponse) Test(org.junit.Test)
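
A companion test for the unary getEndpoint call follows the same pattern; the sketch below assumes the mockEndpointService and client fixtures declared elsewhere in EndpointServiceClientTest, uses placeholder resource IDs, and additionally needs GetEndpointRequest from com.google.cloud.aiplatform.v1.

@Test
public void getEndpointTest() throws Exception {
    Endpoint expectedResponse = Endpoint.newBuilder().setName(EndpointName.of("[PROJECT]", "[LOCATION]", "[ENDPOINT]").toString()).setDisplayName("displayName").build();
    mockEndpointService.addResponse(expectedResponse);
    EndpointName name = EndpointName.of("[PROJECT]", "[LOCATION]", "[ENDPOINT]");
    Endpoint actualResponse = client.getEndpoint(name);
    Assert.assertEquals(expectedResponse, actualResponse);
    List<AbstractMessage> actualRequests = mockEndpointService.getRequests();
    Assert.assertEquals(1, actualRequests.size());
    // The request sent over the mock channel should carry the resource name that was passed in.
    GetEndpointRequest actualRequest = ((GetEndpointRequest) actualRequests.get(0));
    Assert.assertEquals(name.toString(), actualRequest.getName());
}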

Example 94 with Endpoint

use of com.google.cloud.aiplatform.v1.Endpoint in project java-aiplatform by googleapis.

the class CreateTrainingPipelineImageClassificationSample method createTrainingPipelineImageClassificationSample.

static void createTrainingPipelineImageClassificationSample(String project, String trainingPipelineDisplayName, String datasetId, String modelDisplayName) throws IOException {
    PipelineServiceSettings pipelineServiceSettings = PipelineServiceSettings.newBuilder().setEndpoint("us-central1-aiplatform.googleapis.com:443").build();
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (PipelineServiceClient pipelineServiceClient = PipelineServiceClient.create(pipelineServiceSettings)) {
        String location = "us-central1";
        String trainingTaskDefinition = "gs://google-cloud-aiplatform/schema/trainingjob/definition/" + "automl_image_classification_1.0.0.yaml";
        LocationName locationName = LocationName.of(project, location);
        AutoMlImageClassificationInputs autoMlImageClassificationInputs = AutoMlImageClassificationInputs.newBuilder().setModelType(ModelType.CLOUD).setMultiLabel(false).setBudgetMilliNodeHours(8000).setDisableEarlyStopping(false).build();
        InputDataConfig trainingInputDataConfig = InputDataConfig.newBuilder().setDatasetId(datasetId).build();
        Model model = Model.newBuilder().setDisplayName(modelDisplayName).build();
        TrainingPipeline trainingPipeline = TrainingPipeline.newBuilder().setDisplayName(trainingPipelineDisplayName).setTrainingTaskDefinition(trainingTaskDefinition).setTrainingTaskInputs(ValueConverter.toValue(autoMlImageClassificationInputs)).setInputDataConfig(trainingInputDataConfig).setModelToUpload(model).build();
        TrainingPipeline trainingPipelineResponse = pipelineServiceClient.createTrainingPipeline(locationName, trainingPipeline);
        System.out.println("Create Training Pipeline Image Classification Response");
        System.out.format("Name: %s\n", trainingPipelineResponse.getName());
        System.out.format("Display Name: %s\n", trainingPipelineResponse.getDisplayName());
        System.out.format("Training Task Definition: %s\n", trainingPipelineResponse.getTrainingTaskDefinition());
        System.out.format("Training Task Inputs: %s\n", trainingPipelineResponse.getTrainingTaskInputs());
        System.out.format("Training Task Metadata: %s\n", trainingPipelineResponse.getTrainingTaskMetadata());
        System.out.format("State: %s\n", trainingPipelineResponse.getState());
        System.out.format("Create Time: %s\n", trainingPipelineResponse.getCreateTime());
        System.out.format("Start Time: %s\n", trainingPipelineResponse.getStartTime());
        System.out.format("End Time: %s\n", trainingPipelineResponse.getEndTime());
        System.out.format("Update Time: %s\n", trainingPipelineResponse.getUpdateTime());
        System.out.format("Labels: %s\n", trainingPipelineResponse.getLabelsMap());
        InputDataConfig inputDataConfig = trainingPipelineResponse.getInputDataConfig();
        System.out.println("Input Data Config");
        System.out.format("Dataset Id: %s\n", inputDataConfig.getDatasetId());
        System.out.format("Annotations Filter: %s\n", inputDataConfig.getAnnotationsFilter());
        FractionSplit fractionSplit = inputDataConfig.getFractionSplit();
        System.out.println("Fraction Split");
        System.out.format("Training Fraction: %s\n", fractionSplit.getTrainingFraction());
        System.out.format("Validation Fraction: %s\n", fractionSplit.getValidationFraction());
        System.out.format("Test Fraction: %s\n", fractionSplit.getTestFraction());
        FilterSplit filterSplit = inputDataConfig.getFilterSplit();
        System.out.println("Filter Split");
        System.out.format("Training Filter: %s\n", filterSplit.getTrainingFilter());
        System.out.format("Validation Filter: %s\n", filterSplit.getValidationFilter());
        System.out.format("Test Filter: %s\n", filterSplit.getTestFilter());
        PredefinedSplit predefinedSplit = inputDataConfig.getPredefinedSplit();
        System.out.println("Predefined Split");
        System.out.format("Key: %s\n", predefinedSplit.getKey());
        TimestampSplit timestampSplit = inputDataConfig.getTimestampSplit();
        System.out.println("Timestamp Split");
        System.out.format("Training Fraction: %s\n", timestampSplit.getTrainingFraction());
        System.out.format("Validation Fraction: %s\n", timestampSplit.getValidationFraction());
        System.out.format("Test Fraction: %s\n", timestampSplit.getTestFraction());
        System.out.format("Key: %s\n", timestampSplit.getKey());
        Model modelResponse = trainingPipelineResponse.getModelToUpload();
        System.out.println("Model To Upload");
        System.out.format("Name: %s\n", modelResponse.getName());
        System.out.format("Display Name: %s\n", modelResponse.getDisplayName());
        System.out.format("Description: %s\n", modelResponse.getDescription());
        System.out.format("Metadata Schema Uri: %s\n", modelResponse.getMetadataSchemaUri());
        System.out.format("Metadata: %s\n", modelResponse.getMetadata());
        System.out.format("Training Pipeline: %s\n", modelResponse.getTrainingPipeline());
        System.out.format("Artifact Uri: %s\n", modelResponse.getArtifactUri());
        System.out.format("Supported Deployment Resources Types: %s\n", modelResponse.getSupportedDeploymentResourcesTypesList());
        System.out.format("Supported Input Storage Formats: %s\n", modelResponse.getSupportedInputStorageFormatsList());
        System.out.format("Supported Output Storage Formats: %s\n", modelResponse.getSupportedOutputStorageFormatsList());
        System.out.format("Create Time: %s\n", modelResponse.getCreateTime());
        System.out.format("Update Time: %s\n", modelResponse.getUpdateTime());
        System.out.format("Labels: %s\n", modelResponse.getLabelsMap());
        PredictSchemata predictSchemata = modelResponse.getPredictSchemata();
        System.out.println("Predict Schemata");
        System.out.format("Instance Schema Uri: %s\n", predictSchemata.getInstanceSchemaUri());
        System.out.format("Parameters Schema Uri: %s\n", predictSchemata.getParametersSchemaUri());
        System.out.format("Prediction Schema Uri: %s\n", predictSchemata.getPredictionSchemaUri());
        for (ExportFormat exportFormat : modelResponse.getSupportedExportFormatsList()) {
            System.out.println("Supported Export Format");
            System.out.format("Id: %s\n", exportFormat.getId());
        }
        ModelContainerSpec modelContainerSpec = modelResponse.getContainerSpec();
        System.out.println("Container Spec");
        System.out.format("Image Uri: %s\n", modelContainerSpec.getImageUri());
        System.out.format("Command: %s\n", modelContainerSpec.getCommandList());
        System.out.format("Args: %s\n", modelContainerSpec.getArgsList());
        System.out.format("Predict Route: %s\n", modelContainerSpec.getPredictRoute());
        System.out.format("Health Route: %s\n", modelContainerSpec.getHealthRoute());
        for (EnvVar envVar : modelContainerSpec.getEnvList()) {
            System.out.println("Env");
            System.out.format("Name: %s\n", envVar.getName());
            System.out.format("Value: %s\n", envVar.getValue());
        }
        for (Port port : modelContainerSpec.getPortsList()) {
            System.out.println("Port");
            System.out.format("Container Port: %s\n", port.getContainerPort());
        }
        for (DeployedModelRef deployedModelRef : modelResponse.getDeployedModelsList()) {
            System.out.println("Deployed Model");
            System.out.format("Endpoint: %s\n", deployedModelRef.getEndpoint());
            System.out.format("Deployed Model Id: %s\n", deployedModelRef.getDeployedModelId());
        }
        Status status = trainingPipelineResponse.getError();
        System.out.println("Error");
        System.out.format("Code: %s\n", status.getCode());
        System.out.format("Message: %s\n", status.getMessage());
    }
}
Also used : Status(com.google.rpc.Status) PredictSchemata(com.google.cloud.aiplatform.v1.PredictSchemata) TrainingPipeline(com.google.cloud.aiplatform.v1.TrainingPipeline) AutoMlImageClassificationInputs(com.google.cloud.aiplatform.v1.schema.trainingjob.definition.AutoMlImageClassificationInputs) TimestampSplit(com.google.cloud.aiplatform.v1.TimestampSplit) Port(com.google.cloud.aiplatform.v1.Port) ExportFormat(com.google.cloud.aiplatform.v1.Model.ExportFormat) InputDataConfig(com.google.cloud.aiplatform.v1.InputDataConfig) LocationName(com.google.cloud.aiplatform.v1.LocationName) PredefinedSplit(com.google.cloud.aiplatform.v1.PredefinedSplit) FilterSplit(com.google.cloud.aiplatform.v1.FilterSplit) FractionSplit(com.google.cloud.aiplatform.v1.FractionSplit) ModelContainerSpec(com.google.cloud.aiplatform.v1.ModelContainerSpec) DeployedModelRef(com.google.cloud.aiplatform.v1.DeployedModelRef) Model(com.google.cloud.aiplatform.v1.Model) PipelineServiceSettings(com.google.cloud.aiplatform.v1.PipelineServiceSettings) EnvVar(com.google.cloud.aiplatform.v1.EnvVar) PipelineServiceClient(com.google.cloud.aiplatform.v1.PipelineServiceClient)
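
createTrainingPipeline returns as soon as the pipeline resource is created; training itself runs asynchronously, so the State printed above is typically still queued or running. A minimal polling sketch, not one of the repository's samples, is shown below; the one-minute interval is arbitrary, and it additionally needs TrainingPipelineName and PipelineState from com.google.cloud.aiplatform.v1.

static void waitForTrainingPipeline(String project, String trainingPipelineId) throws IOException, InterruptedException {
    PipelineServiceSettings settings = PipelineServiceSettings.newBuilder().setEndpoint("us-central1-aiplatform.googleapis.com:443").build();
    try (PipelineServiceClient client = PipelineServiceClient.create(settings)) {
        TrainingPipelineName name = TrainingPipelineName.of(project, "us-central1", trainingPipelineId);
        TrainingPipeline pipeline = client.getTrainingPipeline(name);
        // Poll until the pipeline leaves its non-terminal states.
        while (pipeline.getState() == PipelineState.PIPELINE_STATE_QUEUED || pipeline.getState() == PipelineState.PIPELINE_STATE_PENDING || pipeline.getState() == PipelineState.PIPELINE_STATE_RUNNING) {
            Thread.sleep(60_000L);
            pipeline = client.getTrainingPipeline(name);
        }
        System.out.format("Final state: %s\n", pipeline.getState());
    }
}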

Example 95 with Endpoint

use of com.google.cloud.aiplatform.v1.Endpoint in project java-aiplatform by googleapis.

the class CreateTrainingPipelineSample method createTrainingPipelineSample.

static void createTrainingPipelineSample(String project, String trainingPipelineDisplayName, String datasetId, String trainingTaskDefinition, String modelDisplayName) throws IOException {
    PipelineServiceSettings pipelineServiceSettings = PipelineServiceSettings.newBuilder().setEndpoint("us-central1-aiplatform.googleapis.com:443").build();
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (PipelineServiceClient pipelineServiceClient = PipelineServiceClient.create(pipelineServiceSettings)) {
        String location = "us-central1";
        LocationName locationName = LocationName.of(project, location);
        String jsonString = "{\"multiLabel\": false, \"modelType\": \"CLOUD\", \"budgetMilliNodeHours\": 8000," + " \"disableEarlyStopping\": false}";
        Value.Builder trainingTaskInputs = Value.newBuilder();
        JsonFormat.parser().merge(jsonString, trainingTaskInputs);
        InputDataConfig trainingInputDataConfig = InputDataConfig.newBuilder().setDatasetId(datasetId).build();
        Model model = Model.newBuilder().setDisplayName(modelDisplayName).build();
        TrainingPipeline trainingPipeline = TrainingPipeline.newBuilder().setDisplayName(trainingPipelineDisplayName).setTrainingTaskDefinition(trainingTaskDefinition).setTrainingTaskInputs(trainingTaskInputs).setInputDataConfig(trainingInputDataConfig).setModelToUpload(model).build();
        TrainingPipeline trainingPipelineResponse = pipelineServiceClient.createTrainingPipeline(locationName, trainingPipeline);
        System.out.println("Create Training Pipeline Response");
        System.out.format("Name: %s\n", trainingPipelineResponse.getName());
        System.out.format("Display Name: %s\n", trainingPipelineResponse.getDisplayName());
        System.out.format("Training Task Definition: %s\n", trainingPipelineResponse.getTrainingTaskDefinition());
        System.out.format("Training Task Inputs: %s\n", trainingPipelineResponse.getTrainingTaskInputs());
        System.out.format("Training Task Metadata: %s\n", trainingPipelineResponse.getTrainingTaskMetadata());
        System.out.format("State: %s\n", trainingPipelineResponse.getState());
        System.out.format("Create Time: %s\n", trainingPipelineResponse.getCreateTime());
        System.out.format("Start Time: %s\n", trainingPipelineResponse.getStartTime());
        System.out.format("End Time: %s\n", trainingPipelineResponse.getEndTime());
        System.out.format("Update Time: %s\n", trainingPipelineResponse.getUpdateTime());
        System.out.format("Labels: %s\n", trainingPipelineResponse.getLabelsMap());
        InputDataConfig inputDataConfig = trainingPipelineResponse.getInputDataConfig();
        System.out.println("Input Data Config");
        System.out.format("Dataset Id: %s\n", inputDataConfig.getDatasetId());
        System.out.format("Annotations Filter: %s\n", inputDataConfig.getAnnotationsFilter());
        FractionSplit fractionSplit = inputDataConfig.getFractionSplit();
        System.out.println("Fraction Split");
        System.out.format("Training Fraction: %s\n", fractionSplit.getTrainingFraction());
        System.out.format("Validation Fraction: %s\n", fractionSplit.getValidationFraction());
        System.out.format("Test Fraction: %s\n", fractionSplit.getTestFraction());
        FilterSplit filterSplit = inputDataConfig.getFilterSplit();
        System.out.println("Filter Split");
        System.out.format("Training Filter: %s\n", filterSplit.getTrainingFilter());
        System.out.format("Validation Filter: %s\n", filterSplit.getValidationFilter());
        System.out.format("Test Filter: %s\n", filterSplit.getTestFilter());
        PredefinedSplit predefinedSplit = inputDataConfig.getPredefinedSplit();
        System.out.println("Predefined Split");
        System.out.format("Key: %s\n", predefinedSplit.getKey());
        TimestampSplit timestampSplit = inputDataConfig.getTimestampSplit();
        System.out.println("Timestamp Split");
        System.out.format("Training Fraction: %s\n", timestampSplit.getTrainingFraction());
        System.out.format("Validation Fraction: %s\n", timestampSplit.getValidationFraction());
        System.out.format("Test Fraction: %s\n", timestampSplit.getTestFraction());
        System.out.format("Key: %s\n", timestampSplit.getKey());
        Model modelResponse = trainingPipelineResponse.getModelToUpload();
        System.out.println("Model To Upload");
        System.out.format("Name: %s\n", modelResponse.getName());
        System.out.format("Display Name: %s\n", modelResponse.getDisplayName());
        System.out.format("Description: %s\n", modelResponse.getDescription());
        System.out.format("Metadata Schema Uri: %s\n", modelResponse.getMetadataSchemaUri());
        System.out.format("Metadata: %s\n", modelResponse.getMetadata());
        System.out.format("Training Pipeline: %s\n", modelResponse.getTrainingPipeline());
        System.out.format("Artifact Uri: %s\n", modelResponse.getArtifactUri());
        System.out.format("Supported Deployment Resources Types: %s\n", modelResponse.getSupportedDeploymentResourcesTypesList());
        System.out.format("Supported Input Storage Formats: %s\n", modelResponse.getSupportedInputStorageFormatsList());
        System.out.format("Supported Output Storage Formats: %s\n", modelResponse.getSupportedOutputStorageFormatsList());
        System.out.format("Create Time: %s\n", modelResponse.getCreateTime());
        System.out.format("Update Time: %s\n", modelResponse.getUpdateTime());
        System.out.format("Labels: %s\n", modelResponse.getLabelsMap());
        PredictSchemata predictSchemata = modelResponse.getPredictSchemata();
        System.out.println("Predict Schemata");
        System.out.format("Instance Schema Uri: %s\n", predictSchemata.getInstanceSchemaUri());
        System.out.format("Parameters Schema Uri: %s\n", predictSchemata.getParametersSchemaUri());
        System.out.format("Prediction Schema Uri: %s\n", predictSchemata.getPredictionSchemaUri());
        for (ExportFormat exportFormat : modelResponse.getSupportedExportFormatsList()) {
            System.out.println("Supported Export Format");
            System.out.format("Id: %s\n", exportFormat.getId());
        }
        ModelContainerSpec modelContainerSpec = modelResponse.getContainerSpec();
        System.out.println("Container Spec");
        System.out.format("Image Uri: %s\n", modelContainerSpec.getImageUri());
        System.out.format("Command: %s\n", modelContainerSpec.getCommandList());
        System.out.format("Args: %s\n", modelContainerSpec.getArgsList());
        System.out.format("Predict Route: %s\n", modelContainerSpec.getPredictRoute());
        System.out.format("Health Route: %s\n", modelContainerSpec.getHealthRoute());
        for (EnvVar envVar : modelContainerSpec.getEnvList()) {
            System.out.println("Env");
            System.out.format("Name: %s\n", envVar.getName());
            System.out.format("Value: %s\n", envVar.getValue());
        }
        for (Port port : modelContainerSpec.getPortsList()) {
            System.out.println("Port");
            System.out.format("Container Port: %s\n", port.getContainerPort());
        }
        for (DeployedModelRef deployedModelRef : modelResponse.getDeployedModelsList()) {
            System.out.println("Deployed Model");
            System.out.format("Endpoint: %s\n", deployedModelRef.getEndpoint());
            System.out.format("Deployed Model Id: %s\n", deployedModelRef.getDeployedModelId());
        }
        Status status = trainingPipelineResponse.getError();
        System.out.println("Error");
        System.out.format("Code: %s\n", status.getCode());
        System.out.format("Message: %s\n", status.getMessage());
    }
}
Also used : Status(com.google.rpc.Status) PredictSchemata(com.google.cloud.aiplatform.v1.PredictSchemata) TrainingPipeline(com.google.cloud.aiplatform.v1.TrainingPipeline) TimestampSplit(com.google.cloud.aiplatform.v1.TimestampSplit) Port(com.google.cloud.aiplatform.v1.Port) ExportFormat(com.google.cloud.aiplatform.v1.Model.ExportFormat) InputDataConfig(com.google.cloud.aiplatform.v1.InputDataConfig) LocationName(com.google.cloud.aiplatform.v1.LocationName) PredefinedSplit(com.google.cloud.aiplatform.v1.PredefinedSplit) FilterSplit(com.google.cloud.aiplatform.v1.FilterSplit) FractionSplit(com.google.cloud.aiplatform.v1.FractionSplit) ModelContainerSpec(com.google.cloud.aiplatform.v1.ModelContainerSpec) DeployedModelRef(com.google.cloud.aiplatform.v1.DeployedModelRef) Value(com.google.protobuf.Value) Model(com.google.cloud.aiplatform.v1.Model) PipelineServiceSettings(com.google.cloud.aiplatform.v1.PipelineServiceSettings) EnvVar(com.google.cloud.aiplatform.v1.EnvVar) PipelineServiceClient(com.google.cloud.aiplatform.v1.PipelineServiceClient)
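
Compared with Example 94, the only substantive difference is that the training task inputs are built by parsing raw JSON with JsonFormat instead of the typed AutoMlImageClassificationInputs schema class, and the trainingTaskDefinition URI is passed in as a parameter. If a pipeline created this way needs to be stopped before it finishes, the same client can request cancellation; a minimal sketch (an illustration, not one of the repository's samples) follows, additionally needing TrainingPipelineName from com.google.cloud.aiplatform.v1.

static void cancelTrainingPipelineSample(String project, String trainingPipelineId) throws IOException {
    PipelineServiceSettings settings = PipelineServiceSettings.newBuilder().setEndpoint("us-central1-aiplatform.googleapis.com:443").build();
    try (PipelineServiceClient client = PipelineServiceClient.create(settings)) {
        TrainingPipelineName name = TrainingPipelineName.of(project, "us-central1", trainingPipelineId);
        // Cancellation is best-effort: the pipeline moves to CANCELLING and later to CANCELLED.
        client.cancelTrainingPipeline(name);
        System.out.format("Requested cancellation of %s\n", name);
    }
}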
