Azure Machine Learning Inference Cluster

This page shows how to write Terraform and Azure Resource Manager templates for a Machine Learning Inference Cluster, and how to write them securely.

azurerm_machine_learning_inference_cluster (Terraform)

The Inference Cluster in Machine Learning can be configured in Terraform with the resource name azurerm_machine_learning_inference_cluster. The following sections describe how to use the resource and its parameters.

Example Usage from GitHub

An example could not be found in GitHub.
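
Although no public example was found, the following is a minimal sketch of how the resource is typically wired together, based on the arguments documented below. All names, the region, and the supporting resources (resource group, Application Insights, Key Vault, storage account, workspace, AKS cluster) are illustrative placeholders, and the storage account and key vault names must be globally unique.

```hcl
# Minimal sketch: attach an AKS cluster to an Azure ML workspace as an
# inference cluster. Names and the region are illustrative placeholders.
provider "azurerm" {
  features {}
}

data "azurerm_client_config" "current" {}

resource "azurerm_resource_group" "example" {
  name     = "example-rg"
  location = "West Europe"
}

resource "azurerm_application_insights" "example" {
  name                = "example-ai"
  location            = azurerm_resource_group.example.location
  resource_group_name = azurerm_resource_group.example.name
  application_type    = "web"
}

resource "azurerm_key_vault" "example" {
  name                = "examplekv20220101"
  location            = azurerm_resource_group.example.location
  resource_group_name = azurerm_resource_group.example.name
  tenant_id           = data.azurerm_client_config.current.tenant_id
  sku_name            = "standard"
}

resource "azurerm_storage_account" "example" {
  name                     = "examplestorage20220101"
  location                 = azurerm_resource_group.example.location
  resource_group_name      = azurerm_resource_group.example.name
  account_tier             = "Standard"
  account_replication_type = "LRS"
}

resource "azurerm_machine_learning_workspace" "example" {
  name                    = "example-mlw"
  location                = azurerm_resource_group.example.location
  resource_group_name     = azurerm_resource_group.example.name
  application_insights_id = azurerm_application_insights.example.id
  key_vault_id            = azurerm_key_vault.example.id
  storage_account_id      = azurerm_storage_account.example.id

  identity {
    type = "SystemAssigned"
  }
}

# The AKS cluster that will back the inference cluster. Three
# Standard_D3_v2 nodes provide the 12 vCPUs required for FastProd.
resource "azurerm_kubernetes_cluster" "example" {
  name                = "example-aks"
  location            = azurerm_resource_group.example.location
  resource_group_name = azurerm_resource_group.example.name
  dns_prefix          = "exampleaks"

  default_node_pool {
    name       = "default"
    node_count = 3
    vm_size    = "Standard_D3_v2"
  }

  identity {
    type = "SystemAssigned"
  }
}

resource "azurerm_machine_learning_inference_cluster" "example" {
  name                          = "example-inference"
  location                      = azurerm_resource_group.example.location
  machine_learning_workspace_id = azurerm_machine_learning_workspace.example.id
  kubernetes_cluster_id         = azurerm_kubernetes_cluster.example.id
  cluster_purpose               = "FastProd"

  identity {
    type = "SystemAssigned"
  }

  tags = {
    environment = "example"
  }
}
```

Because the resource only attaches an existing AKS cluster, any change to its configuration forces a replacement, as noted in the registry documentation further down.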

Review your Terraform file for Azure best practices

Shisho Cloud, our free checker that verifies whether your Terraform configuration follows best practices, is available (beta).

Parameters

The following arguments are supported:

  • name - (Required) The name which should be used for this Machine Learning Inference Cluster. Changing this forces a new Machine Learning Inference Cluster to be created.

  • kubernetes_cluster_id - (Required) The ID of the Kubernetes Cluster. Changing this forces a new Machine Learning Inference Cluster to be created.

  • location - (Required) The Azure Region where the Machine Learning Inference Cluster should exist. Changing this forces a new Machine Learning Inference Cluster to be created.

  • machine_learning_workspace_id - (Required) The ID of the Machine Learning Workspace. Changing this forces a new Machine Learning Inference Cluster to be created.


  • cluster_purpose - (Optional) The purpose of the Inference Cluster. Possible values are DevTest, DenseProd and FastProd. Use DevTest for development or testing workloads. Defaults to FastProd, which is recommended for production workloads. Changing this forces a new Machine Learning Inference Cluster to be created.

NOTE: When creating or attaching a cluster that will be used for production (cluster_purpose = "FastProd"), the cluster must contain at least 12 virtual CPUs. The number of virtual CPUs is the number of nodes in the cluster multiplied by the number of cores provided by the selected VM size. For example, with a VM size of "Standard_D3_v2", which has 4 virtual cores, you should select 3 or more nodes (see the arithmetic sketch after this list).

  • description - (Optional) The description of the Machine Learning Inference Cluster. Changing this forces a new Machine Learning Inference Cluster to be created.

  • identity - (Optional) An identity block as defined below. Changing this forces a new Machine Learning Inference Cluster to be created.

  • ssl - (Optional) A ssl block as defined below. Changing this forces a new Machine Learning Inference Cluster to be created.

  • tags - (Optional) A mapping of tags which should be assigned to the Machine Learning Inference Cluster. Changing this forces a new Machine Learning Inference Cluster to be created.
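
To make the FastProd sizing note above concrete, the multiplication can be sketched directly in HCL; the node count and per-node vCPU figure are illustrative and match the Standard_D3_v2 example.

```hcl
# Sketch of the FastProd sizing rule: nodes x vCPUs per node must be >= 12.
locals {
  node_count     = 3
  vcpus_per_node = 4                                        # e.g. Standard_D3_v2
  total_vcpus    = local.node_count * local.vcpus_per_node  # 3 x 4 = 12
}
```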


An identity block supports the following:

  • type - (Required) The Type of Identity which should be used for this Machine Learning Inference Cluster. Possible values are SystemAssigned, UserAssigned and SystemAssigned,UserAssigned. Changing this forces a new Machine Learning Inference Cluster to be created.

  • identity_ids - (Optional) A list of User Managed Identity IDs which should be assigned to the Machine Learning Inference Cluster. Changing this forces a new Machine Learning Inference Cluster to be created.
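
As a sketch, an identity block using a user-assigned identity could look like the fragment below, placed inside the azurerm_machine_learning_inference_cluster resource; the referenced azurerm_user_assigned_identity.example resource is an illustrative assumption.

```hcl
# Sketch: assign a user-managed identity to the inference cluster.
# azurerm_user_assigned_identity.example is assumed to exist elsewhere.
identity {
  type         = "UserAssigned"
  identity_ids = [azurerm_user_assigned_identity.example.id]
}
```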


An ssl block supports the following:

  • cert - (Optional) The certificate for the ssl configuration. Conflicts with ssl.0.leaf_domain_label and ssl.0.overwrite_existing_domain. Changing this forces a new Machine Learning Inference Cluster to be created.

  • cname - (Optional) The cname of the ssl configuration. Conflicts with ssl.0.leaf_domain_label and ssl.0.overwrite_existing_domain. Changing this forces a new Machine Learning Inference Cluster to be created.

  • key - (Optional) The key content for the ssl configuration. Conflicts with ssl.0.leaf_domain_label and ssl.0.overwrite_existing_domain. Changing this forces a new Machine Learning Inference Cluster to be created.

  • leaf_domain_label - (Optional) The leaf domain label for the ssl configuration. Conflicts with ssl.0.cert, ssl.0.key and ssl.0.cname. Changing this forces a new Machine Learning Inference Cluster to be created.

  • overwrite_existing_domain - (Optional) Whether or not to overwrite the existing leaf domain. Conflicts with ssl.0.cert, ssl.0.key and ssl.0.cname. Changing this forces a new Machine Learning Inference Cluster to be created.
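
As a sketch, the leaf-domain form of the ssl block (which conflicts with supplying cert, key and cname directly) could look like the fragment below, placed inside the resource; the label value is an illustrative placeholder.

```hcl
# Sketch: TLS via a generated certificate on a leaf domain label.
# Mutually exclusive with providing cert/key/cname yourself.
ssl {
  leaf_domain_label         = "example"
  overwrite_existing_domain = true
}
```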

In addition to the Arguments listed above - the following Attributes are exported:

  • id - The ID of the Machine Learning Inference Cluster.

  • identity - An identity block as defined below, which contains the Managed Service Identity information for this Machine Learning Inference Cluster.


An identity block exports the following:

  • principal_id - The Principal ID for the Service Principal associated with the Managed Service Identity of this Machine Learning Inference Cluster.

  • tenant_id - The Tenant ID for the Service Principal associated with the Managed Service Identity of this Machine Learning Inference Cluster.
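
As a sketch, the exported identity attributes can be read back from the resource in the example above, for instance to feed a role assignment; the output name is illustrative.

```hcl
# Sketch: expose the system-assigned identity's principal ID.
output "inference_cluster_principal_id" {
  value = azurerm_machine_learning_inference_cluster.example.identity[0].principal_id
}
```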

Explanation in Terraform Registry

Manages a Machine Learning Inference Cluster.

NOTE: The Machine Learning Inference Cluster resource is used to attach an existing AKS cluster to the Machine Learning Workspace; it doesn't create the AKS cluster itself. It can therefore only be created and deleted, not updated. Any change to the configuration will recreate the resource.

Microsoft.MachineLearningServices/workspaces/computes (Azure Resource Manager)

The workspaces/computes in Microsoft.MachineLearningServices can be configured in Azure Resource Manager with the resource name Microsoft.MachineLearningServices/workspaces/computes. The following sections describe how to use the resource and its parameters.

Example Usage from GitHub

An example could not be found in GitHub.
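
No public example was found for the ARM form either. The following is a minimal, hedged sketch of the workspaces/computes child resource attaching an existing AKS cluster; the apiVersion, names, resource ID and the AKS-specific computeType value are assumptions that should be checked against the current schema.

```json
{
  "type": "Microsoft.MachineLearningServices/workspaces/computes",
  "apiVersion": "2021-07-01",
  "name": "example-mlw/example-inference",
  "location": "westeurope",
  "identity": {
    "type": "SystemAssigned"
  },
  "properties": {
    "computeType": "AKS",
    "description": "Inference cluster attached to an existing AKS cluster",
    "resourceId": "[resourceId('Microsoft.ContainerService/managedClusters', 'example-aks')]"
  }
}
```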

Parameters

  • apiVersion required - string
  • identity optional
      • type optional - string

        The identity type.

      • userAssignedIdentities optional - object

        Dictionary containing all the user assigned identities, with the resourceId of the UAI as the key.

  • location optional - string

    Specifies the location of the resource.

  • name required - string

    Name of the Azure Machine Learning compute.

  • properties required
      • computeLocation optional - string

        Location for the underlying compute.

      • description optional - string

        The description of the Machine Learning compute.

      • disableLocalAuth optional - boolean

        Opt out of local authentication, ensuring that customers use only MSI and AAD for authentication.

      • resourceId optional - string

        ARM resource id of the underlying compute.

  • sku optional
      • name optional - string

        Name of the sku.

      • tier optional - string

        Tier of the sku, like Basic or Enterprise.

  • systemData optional
      • createdAt optional - string

        The timestamp of resource creation (UTC).

      • createdBy optional - string

        The identity that created the resource.

      • createdByType optional - string

        The type of identity that created the resource.

      • lastModifiedAt optional - string

        The timestamp of resource last modification (UTC).

      • lastModifiedBy optional - string

        The identity that last modified the resource.

      • lastModifiedByType optional - string

        The type of identity that last modified the resource.

  • tags optional - string

    Contains resource tags defined as key/value pairs.

  • type required - string

Frequently asked questions

What is Azure Machine Learning Inference Cluster?

Azure Machine Learning Inference Cluster is a resource for Machine Learning in Microsoft Azure. Settings can be written in Terraform.