Azure Machine Learning Inference Cluster
This page shows how to configure a Machine Learning Inference Cluster in Terraform and Azure Resource Manager, and how to write that configuration securely.
azurerm_machine_learning_inference_cluster (Terraform)
The Inference Cluster in Machine Learning can be configured in Terraform with the resource name azurerm_machine_learning_inference_cluster. The following sections describe how to use the resource and its parameters.
Example Usage from GitHub
An example could not be found in GitHub.
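Since no published example was found, a minimal configuration can be sketched from the arguments documented below. This is a sketch, not a verified example: the resource names are illustrative, and it assumes an azurerm_resource_group and an azurerm_machine_learning_workspace named example are defined elsewhere in the configuration.

```hcl
resource "azurerm_kubernetes_cluster" "example" {
  name                = "example-aks"
  location            = azurerm_resource_group.example.location
  resource_group_name = azurerm_resource_group.example.name
  dns_prefix          = "exampleaks"

  # FastProd requires at least 12 vCPUs: 3 nodes x 4 cores (Standard_D3_v2).
  default_node_pool {
    name       = "default"
    node_count = 3
    vm_size    = "Standard_D3_v2"
  }

  identity {
    type = "SystemAssigned"
  }
}

resource "azurerm_machine_learning_inference_cluster" "example" {
  name                          = "example-inference"
  location                      = azurerm_resource_group.example.location
  cluster_purpose               = "FastProd"
  kubernetes_cluster_id         = azurerm_kubernetes_cluster.example.id
  machine_learning_workspace_id = azurerm_machine_learning_workspace.example.id

  tags = {
    environment = "production"
  }
}
```

Because every argument forces recreation, changing any value here will detach and re-attach the cluster rather than update it in place.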
Parameters
The following arguments are supported:
name - (Required) The name which should be used for this Machine Learning Inference Cluster. Changing this forces a new Machine Learning Inference Cluster to be created.
kubernetes_cluster_id - (Required) The ID of the Kubernetes Cluster. Changing this forces a new Machine Learning Inference Cluster to be created.
location - (Required) The Azure Region where the Machine Learning Inference Cluster should exist. Changing this forces a new Machine Learning Inference Cluster to be created.
machine_learning_workspace_id - (Required) The ID of the Machine Learning Workspace. Changing this forces a new Machine Learning Inference Cluster to be created.
cluster_purpose - (Optional) The purpose of the Inference Cluster. Options are DevTest, DenseProd and FastProd. If used for Development or Testing, use DevTest here. The default is FastProd, which is recommended for production workloads. Changing this forces a new Machine Learning Inference Cluster to be created.
NOTE: When creating or attaching a cluster, if the cluster will be used for production (cluster_purpose = "FastProd"), then it must contain at least 12 virtual CPUs. The number of virtual CPUs can be calculated by multiplying the number of nodes in the cluster by the number of cores provided by the selected VM size. For example, if you use a VM size of "Standard_D3_v2", which has 4 virtual cores, then you should select 3 or more nodes.
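The sizing rule above can be expressed directly in the node pool of the AKS cluster that backs the inference cluster. A sketch, assuming it sits inside an azurerm_kubernetes_cluster resource defined elsewhere in your configuration:

```hcl
# 3 nodes x 4 vCPUs each (Standard_D3_v2) = 12 vCPUs, the minimum for FastProd.
default_node_pool {
  name       = "default"
  vm_size    = "Standard_D3_v2"
  node_count = 3
}
```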
description - (Optional) The description of the Machine Learning Inference Cluster. Changing this forces a new Machine Learning Inference Cluster to be created.
identity - (Optional) An identity block as defined below. Changing this forces a new Machine Learning Inference Cluster to be created.
ssl - (Optional) An ssl block as defined below. Changing this forces a new Machine Learning Inference Cluster to be created.
tags - (Optional) A mapping of tags which should be assigned to the Machine Learning Inference Cluster. Changing this forces a new Machine Learning Inference Cluster to be created.
An identity block supports the following:
type - (Required) The type of identity which should be used for this Machine Learning Inference Cluster. Possible values are SystemAssigned, UserAssigned and SystemAssigned,UserAssigned. Changing this forces a new Machine Learning Inference Cluster to be created.
identity_ids - (Optional) A list of User Managed Identity IDs which should be assigned to the Machine Learning Inference Cluster. Changing this forces a new Machine Learning Inference Cluster to be created.
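As a sketch, a user-assigned identity would be wired in like this (the azurerm_user_assigned_identity resource named example is an assumption, defined elsewhere):

```hcl
identity {
  type         = "UserAssigned"
  identity_ids = [azurerm_user_assigned_identity.example.id]
}
```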
An ssl block supports the following:
cert - (Optional) The certificate for the ssl configuration. Conflicts with ssl.0.leaf_domain_label and ssl.0.overwrite_existing_domain. Changing this forces a new Machine Learning Inference Cluster to be created.
cname - (Optional) The cname of the ssl configuration. Conflicts with ssl.0.leaf_domain_label and ssl.0.overwrite_existing_domain. Changing this forces a new Machine Learning Inference Cluster to be created.
key - (Optional) The key content for the ssl configuration. Conflicts with ssl.0.leaf_domain_label and ssl.0.overwrite_existing_domain. Changing this forces a new Machine Learning Inference Cluster to be created.
leaf_domain_label - (Optional) The leaf domain label for the ssl configuration. Conflicts with ssl.0.cert, ssl.0.key and ssl.0.cname. Changing this forces a new Machine Learning Inference Cluster to be created.
overwrite_existing_domain - (Optional) Whether or not to overwrite the existing leaf domain. Conflicts with ssl.0.cert, ssl.0.key and ssl.0.cname. Changing this forces a new Machine Learning Inference Cluster to be created.
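Because the two groups of arguments conflict, an ssl block uses either an Azure-provisioned certificate via leaf_domain_label or a custom certificate via cert, key and cname, never both. A sketch of the two variants (all values are illustrative):

```hcl
# Variant 1: let Azure provision a certificate for a generated leaf domain.
ssl {
  leaf_domain_label         = "example"
  overwrite_existing_domain = true
}

# Variant 2: bring your own certificate (conflicts with variant 1's arguments).
ssl {
  cert  = file("cert.pem")
  key   = file("key.pem")
  cname = "ml.example.com"
}
```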
In addition to the arguments listed above, the following attributes are exported:
id - The ID of the Machine Learning Inference Cluster.
identity - An identity block as defined below, which contains the Managed Service Identity information for this Machine Learning Inference Cluster.
An identity block exports the following:
principal_id - The Principal ID for the Service Principal associated with the Managed Service Identity of this Machine Learning Inference Cluster.
tenant_id - The Tenant ID for the Service Principal associated with the Managed Service Identity of this Machine Learning Inference Cluster.
Explanation in Terraform Registry
Manages a Machine Learning Inference Cluster.
NOTE: The Machine Learning Inference Cluster resource is used to attach an existing AKS cluster to the Machine Learning Workspace; it doesn't create the AKS cluster itself. It can therefore only be created and deleted, not updated. Any change to the configuration will recreate the resource.
Microsoft.MachineLearningServices/workspaces/computes (Azure Resource Manager)
The workspaces/computes resource in Microsoft.MachineLearningServices can be configured in Azure Resource Manager with the resource name Microsoft.MachineLearningServices/workspaces/computes. The following sections describe how to use the resource and its parameters.
Example Usage from GitHub
An example could not be found in GitHub.
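Again no published example was found, so the following is a sketch assembled from the parameters below, not a verified template. The apiVersion, all names and locations, and the computeType property (which this page's parameter list does not document) are assumptions:

```json
{
  "type": "Microsoft.MachineLearningServices/workspaces/computes",
  "apiVersion": "2022-05-01",
  "name": "example-workspace/example-inference",
  "location": "eastus",
  "identity": {
    "type": "SystemAssigned"
  },
  "properties": {
    "computeType": "AKS",
    "description": "Inference cluster attached to an existing AKS cluster",
    "disableLocalAuth": true,
    "resourceId": "[resourceId('Microsoft.ContainerService/managedClusters', 'example-aks')]"
  }
}
```

As with the Terraform resource, the attached AKS cluster is referenced through resourceId rather than created by this resource.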
Parameters
apiVersion - required - string
identity - optional
  type - optional - string. The identity type.
  userAssignedIdentities - optional - dictionary containing all the user assigned identities, with the resourceId of the UAI as the key.
location - optional - string. Specifies the location of the resource.
name - required - string. Name of the Azure Machine Learning compute.
properties - required
  computeLocation - optional - string. Location for the underlying compute.
  description - optional - string. The description of the Machine Learning compute.
  disableLocalAuth - optional - boolean. Opt-out of local authentication and ensure customers can use only MSI and AAD exclusively for authentication.
  resourceId - optional - string. ARM resource id of the underlying compute.
sku - optional
  name - optional - string. Name of the sku.
  tier - optional - string. Tier of the sku, like Basic or Enterprise.
systemData - optional
  createdAt - optional - string. The timestamp of resource creation (UTC).
  createdBy - optional - string. The identity that created the resource.
  createdByType - optional - string. The type of identity that created the resource.
  lastModifiedAt - optional - string. The timestamp of resource last modification (UTC).
  lastModifiedBy - optional - string. The identity that last modified the resource.
  lastModifiedByType - optional - string. The type of identity that last modified the resource.
tags - optional - string. Contains resource tags defined as key/value pairs.
type - required - string