Struct gapi_grpc::google::cloud::aiplatform::v1beta1::explanation_metadata::InputMetadata

pub struct InputMetadata {
    pub input_baselines: Vec<Value>,
    pub input_tensor_name: String,
    pub encoding: i32,
    pub modality: String,
    pub feature_value_domain: Option<FeatureValueDomain>,
    pub indices_tensor_name: String,
    pub dense_shape_tensor_name: String,
    pub index_feature_mapping: Vec<String>,
    pub encoded_tensor_name: String,
    pub encoded_baselines: Vec<Value>,
    pub visualization: Option<Visualization>,
    pub group_name: String,
}

Metadata of the input of a feature.

Fields other than [InputMetadata.input_baselines][google.cloud.aiplatform.v1beta1.ExplanationMetadata.InputMetadata.input_baselines] are applicable only for Models that are using Vertex AI-provided images for Tensorflow.

Fields

input_baselines: Vec<Value>

Baseline inputs for this feature.

If no baseline is specified, Vertex AI chooses the baseline for this feature. If multiple baselines are specified, Vertex AI returns the average attributions across them in [Attributions.baseline_attribution][].

For Vertex AI-provided Tensorflow images (both 1.x and 2.x), the shape of each baseline must match the shape of the input tensor. If a scalar is provided, Vertex AI broadcasts it to the same shape as the input tensor.

For custom images, each element of the baselines must be in the same format as the feature’s input in the [instance][google.cloud.aiplatform.v1beta1.ExplainRequest.instances]. The schema of any single instance may be specified via Endpoint’s DeployedModels’ [Model’s][google.cloud.aiplatform.v1beta1.DeployedModel.model] [PredictSchemata’s][google.cloud.aiplatform.v1beta1.Model.predict_schemata] [instance_schema_uri][google.cloud.aiplatform.v1beta1.PredictSchemata.instance_schema_uri].
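For example, a scalar baseline can be supplied for a numeric feature. The following is only an illustrative sketch: it assumes Value is prost_types::Value (the prost rendering of google.protobuf.Value), and the tensor name is hypothetical.

use gapi_grpc::google::cloud::aiplatform::v1beta1::explanation_metadata::InputMetadata;
use prost_types::{value::Kind, Value};

// A scalar baseline of 0.0; for Vertex AI-provided Tensorflow images it is
// broadcast to the shape of the input tensor.
let baseline = Value {
    kind: Some(Kind::NumberValue(0.0)),
};

let metadata = InputMetadata {
    input_tensor_name: "dense_input".to_string(), // hypothetical tensor name
    input_baselines: vec![baseline],
    ..Default::default()
};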

input_tensor_name: String

Name of the input tensor for this feature. Required and is only applicable to Vertex AI-provided images for Tensorflow.

encoding: i32

Defines how the feature is encoded into the input tensor. Defaults to IDENTITY.

modality: String

Modality of the feature. Valid values are: numeric, image. Defaults to numeric.

feature_value_domain: Option<FeatureValueDomain>

The domain details of the input feature value, such as its min/max and, if the feature was normalized, its original mean and standard deviation.

indices_tensor_name: String

Specifies the index of the values of the input tensor. Required when the input tensor is a sparse representation. Refer to Tensorflow documentation for more details: https://www.tensorflow.org/api_docs/python/tf/sparse/SparseTensor.

dense_shape_tensor_name: String

Specifies the shape of the values of the input if the input is a sparse representation. Refer to Tensorflow documentation for more details: https://www.tensorflow.org/api_docs/python/tf/sparse/SparseTensor.

index_feature_mapping: Vec<String>

A list of feature names for each index in the input tensor. Required when the input [InputMetadata.encoding][google.cloud.aiplatform.v1beta1.ExplanationMetadata.InputMetadata.encoding] is BAG_OF_FEATURES, BAG_OF_FEATURES_SPARSE, or INDICATOR.
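As an illustration, a sparse categorical input could be described as below. This is only a sketch: the tensor and feature names are hypothetical, and the path of the Encoding enum (explanation_metadata::input_metadata::Encoding) and its BagOfFeaturesSparse variant are assumed to follow prost’s usual rendering of the BAG_OF_FEATURES_SPARSE proto value.

use gapi_grpc::google::cloud::aiplatform::v1beta1::explanation_metadata::{
    input_metadata::Encoding, InputMetadata,
};

let metadata = InputMetadata {
    // Sparse inputs carry three tensors: values, indices, and dense shape.
    input_tensor_name: "tags_values".to_string(),
    indices_tensor_name: "tags_indices".to_string(),
    dense_shape_tensor_name: "tags_dense_shape".to_string(),
    // One feature name per index in the input tensor; required for
    // BAG_OF_FEATURES, BAG_OF_FEATURES_SPARSE, and INDICATOR encodings.
    index_feature_mapping: vec![
        "tag_a".to_string(),
        "tag_b".to_string(),
        "tag_c".to_string(),
    ],
    // The raw field is an i32 holding the enum's numeric value.
    encoding: Encoding::BagOfFeaturesSparse as i32,
    ..Default::default()
};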

encoded_tensor_name: String

The encoded tensor is a transformation of the input tensor. Must be provided if choosing [Integrated Gradients attribution][google.cloud.aiplatform.v1beta1.ExplanationParameters.integrated_gradients_attribution] or [XRAI attribution][google.cloud.aiplatform.v1beta1.ExplanationParameters.xrai_attribution] and the input tensor is not differentiable.

An encoded tensor is generated if the input tensor is encoded by a lookup table.

encoded_baselines: Vec<Value>

A list of baselines for the encoded tensor.

The shape of each baseline should match the shape of the encoded tensor. If a scalar is provided, Vertex AI broadcasts to the same shape as the encoded tensor.

visualization: Option<Visualization>

Visualization configurations for image explanation.

group_name: String

Name of the group that the input belongs to. Features with the same group name will be treated as one feature when computing attributions. Features grouped together can have values of different shapes. If provided, a single attribution will be generated in [featureAttributions][Attribution.feature_attributions], keyed by the group name.
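For instance, two related inputs can be attributed as a single feature by sharing a group name. The sketch below assumes the parent ExplanationMetadata message exposes its inputs map as a HashMap<String, InputMetadata> (prost’s usual rendering of a proto map field); all names are hypothetical.

use std::collections::HashMap;
use gapi_grpc::google::cloud::aiplatform::v1beta1::{
    explanation_metadata::InputMetadata, ExplanationMetadata,
};

let mut inputs: HashMap<String, InputMetadata> = HashMap::new();
for tensor in ["query_embedding_a", "query_embedding_b"] {
    inputs.insert(
        tensor.to_string(),
        InputMetadata {
            input_tensor_name: tensor.to_string(),
            // Both inputs are attributed together under one key, "query_embedding".
            group_name: "query_embedding".to_string(),
            ..Default::default()
        },
    );
}

let explanation_metadata = ExplanationMetadata {
    inputs,
    ..Default::default()
};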

Implementations

impl InputMetadata

pub fn encoding(&self) -> Encoding

Returns the enum value of encoding, or the default if the field is set to an invalid enum value.

pub fn set_encoding(&mut self, value: Encoding)

Sets encoding to the provided enum value.
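A minimal usage sketch for this accessor pair, assuming the Encoding enum is the nested explanation_metadata::input_metadata::Encoding type with an Identity variant (prost’s rendering of IDENTITY):

use gapi_grpc::google::cloud::aiplatform::v1beta1::explanation_metadata::{
    input_metadata::Encoding, InputMetadata,
};

let mut metadata = InputMetadata::default();

// The setter stores the enum's numeric value in the raw i32 field.
metadata.set_encoding(Encoding::Identity);
assert_eq!(metadata.encoding, Encoding::Identity as i32);

// The getter converts back, falling back to the default enum value if the
// field holds an integer that does not correspond to any variant.
assert_eq!(metadata.encoding(), Encoding::Identity);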

Trait Implementations

impl Clone for InputMetadata

impl Debug for InputMetadata

impl Default for InputMetadata

impl Message for InputMetadata

impl PartialEq<InputMetadata> for InputMetadata

impl StructuralPartialEq for InputMetadata

Auto Trait Implementations

impl RefUnwindSafe for InputMetadata

impl Send for InputMetadata

impl Sync for InputMetadata

impl Unpin for InputMetadata

impl UnwindSafe for InputMetadata

Blanket Implementations

impl<T> Any for T where
    T: 'static + ?Sized

impl<T> Borrow<T> for T where
    T: ?Sized

impl<T> BorrowMut<T> for T where
    T: ?Sized

impl<T> From<T> for T

impl<T> Instrument for T

impl<T> Instrument for T

impl<T, U> Into<U> for T where
    U: From<T>

impl<T> IntoRequest<T> for T

impl<T> ToOwned for T where
    T: Clone

type Owned = T

The resulting type after obtaining ownership.

impl<T, U> TryFrom<U> for T where
    U: Into<T>

type Error = Infallible

The type returned in the event of a conversion error.

impl<T, U> TryInto<U> for T where
    U: TryFrom<T>

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.

impl<V, T> VZip<V> for T where
    V: MultiLane<T>

impl<T> WithSubscriber for T