Struct gapi_grpc::google::cloud::aiplatform::v1::ModelContainerSpec
Specification of a container for serving predictions. Some fields in this message correspond to fields in the Kubernetes Container v1 core specification.
Fields
image_uri: String
Required. Immutable. URI of the Docker image to be used as the custom container for serving predictions. This URI must identify an image in Artifact Registry or Container Registry. Learn more about the container publishing requirements, including permissions requirements for the AI Platform Service Agent.
The container image is ingested upon [ModelService.UploadModel][google.cloud.aiplatform.v1.ModelService.UploadModel], stored internally, and this original path is afterwards not used.
To learn about the requirements for the Docker image itself, see Custom container requirements.
You can set this field to the URI of one of Vertex AI’s pre-built container images for prediction.
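For example, a minimal sketch of populating this field when building a spec in Rust. The image URI below is a placeholder rather than a real image, and the remaining fields fall back to their Default values:

use gapi_grpc::google::cloud::aiplatform::v1::ModelContainerSpec;

fn main() {
    // Hypothetical image in Artifact Registry; substitute your own published image.
    let spec = ModelContainerSpec {
        image_uri: "us-docker.pkg.dev/my-project/my-repo/my-server:latest".to_string(),
        ..Default::default()
    };
    println!("{:?}", spec);
}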
command: Vec<String>
Immutable. Specifies the command that runs when the container starts. This overrides the container’s ENTRYPOINT. Specify this field as an array of executable and arguments, similar to a Docker ENTRYPOINT’s “exec” form, not its “shell” form.

If you do not specify this field, then the container’s ENTRYPOINT runs, in conjunction with the [args][google.cloud.aiplatform.v1.ModelContainerSpec.args] field or the container’s CMD, if either exists. If this field is not specified and the container does not have an ENTRYPOINT, then refer to the Docker documentation about how CMD and ENTRYPOINT interact.

If you specify this field, then you can also specify the args field to provide additional arguments for this command. However, if you specify this field, then the container’s CMD is ignored. See the Kubernetes documentation about how the command and args fields interact with a container’s ENTRYPOINT and CMD.

In this field, you can reference environment variables set by Vertex AI and environment variables set in the [env][google.cloud.aiplatform.v1.ModelContainerSpec.env] field. You cannot reference environment variables set in the Docker image. In order for environment variables to be expanded, reference them by using the following syntax: $(VARIABLE_NAME). Note that this differs from Bash variable expansion, which does not use parentheses. If a variable cannot be resolved, the reference in the input string is used unchanged. To avoid variable expansion, you can escape this syntax with $$; for example: $$(VARIABLE_NAME).

This field corresponds to the command field of the Kubernetes Containers v1 core API.
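As a hedged illustration, the exec-form array and the $(VARIABLE_NAME) expansion syntax look like the sketch below. The server path and the AIP_HTTP_PORT variable are assumptions about your image and about the variables Vertex AI injects; see the custom container requirements for the authoritative list:

use gapi_grpc::google::cloud::aiplatform::v1::ModelContainerSpec;

fn main() {
    let spec = ModelContainerSpec {
        // Exec form: one array element per executable/argument; no shell is involved.
        command: vec![
            "python3".to_string(),
            "/app/server.py".to_string(),
            // $(AIP_HTTP_PORT) is expanded by Vertex AI at runtime, not by a shell.
            "--port=$(AIP_HTTP_PORT)".to_string(),
        ],
        ..Default::default()
    };
    assert_eq!(spec.command.len(), 3);
}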
args: Vec<String>
Immutable. Specifies arguments for the command that runs when the container starts. This overrides the container’s CMD. Specify this field as an array of executable and arguments, similar to a Docker CMD’s “default parameters” form.

If you don’t specify this field but do specify the [command][google.cloud.aiplatform.v1.ModelContainerSpec.command] field, then the command from the command field runs without any additional arguments. See the Kubernetes documentation about how the command and args fields interact with a container’s ENTRYPOINT and CMD.

If you don’t specify this field and don’t specify the command field, then the container’s ENTRYPOINT and CMD determine what runs based on their default behavior. See the Docker documentation about how CMD and ENTRYPOINT interact.

In this field, you can reference environment variables set by Vertex AI and environment variables set in the [env][google.cloud.aiplatform.v1.ModelContainerSpec.env] field. You cannot reference environment variables set in the Docker image. In order for environment variables to be expanded, reference them by using the following syntax: $(VARIABLE_NAME). Note that this differs from Bash variable expansion, which does not use parentheses. If a variable cannot be resolved, the reference in the input string is used unchanged. To avoid variable expansion, you can escape this syntax with $$; for example: $$(VARIABLE_NAME).

This field corresponds to the args field of the Kubernetes Containers v1 core API.
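A small sketch of the “default parameters” pattern: command is left empty so the image’s ENTRYPOINT runs, and args supplies extra arguments. The flag names below are placeholders for whatever your ENTRYPOINT actually accepts:

use gapi_grpc::google::cloud::aiplatform::v1::ModelContainerSpec;

fn main() {
    let spec = ModelContainerSpec {
        // command is unset, so the container's ENTRYPOINT runs with these arguments
        // and the image's CMD is ignored.
        args: vec!["--workers=2".to_string(), "--log-level=info".to_string()],
        ..Default::default()
    };
    assert!(spec.command.is_empty());
}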
env: Vec<EnvVar>
Immutable. List of environment variables to set in the container. After the container starts running, code running in the container can read these environment variables.
Additionally, the [command][google.cloud.aiplatform.v1.ModelContainerSpec.command] and [args][google.cloud.aiplatform.v1.ModelContainerSpec.args] fields can reference these variables. Later entries in this list can also reference earlier entries. For example, the following example sets the variable VAR_2 to have the value foo bar:
[
{
"name": "VAR_1",
"value": "foo"
},
{
"name": "VAR_2",
"value": "$(VAR_1) bar"
}
]
If you switch the order of the variables in the example, then the expansion does not occur.
This field corresponds to the env field of the Kubernetes Containers v1 core API.
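The same VAR_1/VAR_2 example expressed in Rust, assuming the prost-generated EnvVar struct exposes name and value as plain Strings (as the field type above suggests):

use gapi_grpc::google::cloud::aiplatform::v1::{EnvVar, ModelContainerSpec};

fn main() {
    let spec = ModelContainerSpec {
        env: vec![
            EnvVar { name: "VAR_1".to_string(), value: "foo".to_string() },
            // Later entries may reference earlier ones; VAR_2 expands to "foo bar".
            EnvVar { name: "VAR_2".to_string(), value: "$(VAR_1) bar".to_string() },
        ],
        ..Default::default()
    };
    assert_eq!(spec.env[1].value, "$(VAR_1) bar");
}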
ports: Vec<Port>
Immutable. List of ports to expose from the container. Vertex AI sends any prediction requests that it receives to the first port on this list. Vertex AI also sends liveness and health checks to this port.
If you do not specify this field, it defaults to the following value:
[
{
"containerPort": 8080
}
]
Vertex AI does not use ports other than the first one listed. This field corresponds to the ports field of the Kubernetes Containers v1 core API.
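A sketch of spelling out that default explicitly, assuming the prost-generated Port struct has a container_port: i32 field mirroring containerPort in the JSON above:

use gapi_grpc::google::cloud::aiplatform::v1::{ModelContainerSpec, Port};

fn main() {
    let spec = ModelContainerSpec {
        // Only the first port is used; it receives prediction requests and health checks.
        ports: vec![Port { container_port: 8080 }],
        ..Default::default()
    };
    assert_eq!(spec.ports[0].container_port, 8080);
}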
predict_route: String
Immutable. HTTP path on the container to send prediction requests to. Vertex AI forwards requests sent using [projects.locations.endpoints.predict][google.cloud.aiplatform.v1.PredictionService.Predict] to this path on the container’s IP address and port. Vertex AI then returns the container’s response in the API response.
For example, if you set this field to /foo, then when Vertex AI receives a prediction request, it forwards the request body in a POST request to the /foo path on the port of your container specified by the first value of this ModelContainerSpec’s [ports][google.cloud.aiplatform.v1.ModelContainerSpec.ports] field.

If you don’t specify this field, it defaults to the following value when you [deploy this Model to an Endpoint][google.cloud.aiplatform.v1.EndpointService.DeployModel]:

/v1/endpoints/ENDPOINT/deployedModels/DEPLOYED_MODEL:predict

The placeholders in this value are replaced as follows:

- ENDPOINT: The last segment (following endpoints/) of the [Endpoint.name][] field of the Endpoint where this Model has been deployed. (Vertex AI makes this value available to your container code as the AIP_ENDPOINT_ID environment variable.)
- DEPLOYED_MODEL: [DeployedModel.id][google.cloud.aiplatform.v1.DeployedModel.id] of the DeployedModel. (Vertex AI makes this value available to your container code as the AIP_DEPLOYED_MODEL_ID environment variable.)
health_route: String
Immutable. HTTP path on the container to send health checks to. Vertex AI intermittently sends GET requests to this path on the container’s IP address and port to check that the container is healthy. Read more about health checks.
For example, if you set this field to /bar, then Vertex AI intermittently sends a GET request to the /bar path on the port of your container specified by the first value of this ModelContainerSpec’s [ports][google.cloud.aiplatform.v1.ModelContainerSpec.ports] field.

If you don’t specify this field, it defaults to the following value when you [deploy this Model to an Endpoint][google.cloud.aiplatform.v1.EndpointService.DeployModel]:

/v1/endpoints/ENDPOINT/deployedModels/DEPLOYED_MODEL:predict

The placeholders in this value are replaced as follows:

- ENDPOINT: The last segment (following endpoints/) of the [Endpoint.name][] field of the Endpoint where this Model has been deployed. (Vertex AI makes this value available to your container code as the AIP_ENDPOINT_ID environment variable.)
- DEPLOYED_MODEL: [DeployedModel.id][google.cloud.aiplatform.v1.DeployedModel.id] of the DeployedModel. (Vertex AI makes this value available to your container code as the AIP_DEPLOYED_MODEL_ID environment variable.)
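Putting the two route fields together, a hedged sketch; the paths are arbitrary examples, and your container must actually serve them:

use gapi_grpc::google::cloud::aiplatform::v1::ModelContainerSpec;

fn main() {
    let spec = ModelContainerSpec {
        // Vertex AI POSTs prediction request bodies to this path ...
        predict_route: "/foo".to_string(),
        // ... and GETs this path for health checks.
        health_route: "/bar".to_string(),
        ..Default::default()
    };
    println!("predict: {} health: {}", spec.predict_route, spec.health_route);
}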
Trait Implementations
impl Clone for ModelContainerSpec
fn clone(&self) -> ModelContainerSpec
pub fn clone_from(&mut self, source: &Self)
impl Debug for ModelContainerSpec
impl Default for ModelContainerSpec
fn default() -> ModelContainerSpec
impl Message for ModelContainerSpec
fn encode_raw<B>(&self, buf: &mut B) where B: BufMut
fn merge_field<B>(&mut self, tag: u32, wire_type: WireType, buf: &mut B, ctx: DecodeContext) -> Result<(), DecodeError> where B: Buf
fn encoded_len(&self) -> usize
fn clear(&mut self)
pub fn encode<B>(&self, buf: &mut B) -> Result<(), EncodeError> where B: BufMut
pub fn encode_length_delimited<B>(&self, buf: &mut B) -> Result<(), EncodeError> where B: BufMut
pub fn decode<B>(buf: B) -> Result<Self, DecodeError> where Self: Default, B: Buf
pub fn decode_length_delimited<B>(buf: B) -> Result<Self, DecodeError> where Self: Default, B: Buf
pub fn merge<B>(&mut self, buf: B) -> Result<(), DecodeError> where B: Buf
pub fn merge_length_delimited<B>(&mut self, buf: B) -> Result<(), DecodeError> where B: Buf
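Because ModelContainerSpec implements prost’s Message trait, it can be serialized to and parsed from protobuf wire format directly. A minimal round-trip sketch, assuming prost is available as a dependency (it is for this generated crate):

use gapi_grpc::google::cloud::aiplatform::v1::ModelContainerSpec;
use prost::Message;

fn main() -> Result<(), prost::DecodeError> {
    let spec = ModelContainerSpec {
        image_uri: "gcr.io/my-project/my-server:latest".to_string(),
        ..Default::default()
    };

    // Serialize to protobuf bytes ...
    let mut buf = Vec::with_capacity(spec.encoded_len());
    spec.encode(&mut buf).expect("encoding into a Vec<u8> cannot run out of capacity");

    // ... and parse them back.
    let decoded = ModelContainerSpec::decode(buf.as_slice())?;
    assert_eq!(spec, decoded);
    Ok(())
}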
impl PartialEq<ModelContainerSpec> for ModelContainerSpec
fn eq(&self, other: &ModelContainerSpec) -> bool
fn ne(&self, other: &ModelContainerSpec) -> bool
impl StructuralPartialEq for ModelContainerSpec
Auto Trait Implementations
impl RefUnwindSafe for ModelContainerSpec
impl Send for ModelContainerSpec
impl Sync for ModelContainerSpec
impl Unpin for ModelContainerSpec
impl UnwindSafe for ModelContainerSpec
Blanket Implementations
impl<T> Any for T where T: 'static + ?Sized
impl<T> Borrow<T> for T where T: ?Sized
impl<T> BorrowMut<T> for T where T: ?Sized
pub fn borrow_mut(&mut self) -> &mut T
impl<T> From<T> for T
impl<T> Instrument for T
pub fn instrument(self, span: Span) -> Instrumented<Self>
pub fn in_current_span(self) -> Instrumented<Self>
impl<T> Instrument for T
pub fn instrument(self, span: Span) -> Instrumented<Self>
pub fn in_current_span(self) -> Instrumented<Self>
impl<T, U> Into<U> for T where U: From<T>
impl<T> IntoRequest<T> for T
pub fn into_request(self) -> Request<T>
impl<T> ToOwned for T where T: Clone
type Owned = T
The resulting type after obtaining ownership.
pub fn to_owned(&self) -> T
pub fn clone_into(&self, target: &mut T)
impl<T, U> TryFrom<U> for T where U: Into<T>
type Error = Infallible
The type returned in the event of a conversion error.
pub fn try_from(value: U) -> Result<T, <T as TryFrom<U>>::Error>
impl<T, U> TryInto<U> for T where U: TryFrom<T>
type Error = <U as TryFrom<T>>::Error
The type returned in the event of a conversion error.
pub fn try_into(self) -> Result<U, <U as TryFrom<T>>::Error>
impl<V, T> VZip<V> for T where V: MultiLane<T>
impl<T> WithSubscriber for T
pub fn with_subscriber<S>(self, subscriber: S) -> WithDispatch<Self> where S: Into<Dispatch>