Azure Data Lake Analytics Account

This page shows how to write Terraform and Azure Resource Manager templates for a Data Lake Analytics Account, and how to write them securely.

azurerm_data_lake_analytics_account (Terraform)

The Analytics Account in Data Lake can be configured in Terraform with the resource name azurerm_data_lake_analytics_account. The following sections describe an example of how to use the resource and its parameters.

Example Usage from GitHub

main.tf#L7
resource "azurerm_data_lake_analytics_account" "this" {
  default_store_account_name = var.default_store_account_name
  location                   = var.location
  name                       = var.name
  resource_group_name        = var.resource_group_name
  tags                       = var.tags
}
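The snippet above only wires variables through; a more complete sketch of the resource in context might look like the following. All resource names and values here are illustrative assumptions, not taken from the source repository:

```hcl
# Sketch only: names, locations, and tags are illustrative assumptions.
resource "azurerm_resource_group" "example" {
  name     = "example-resources"
  location = "northeurope"
}

resource "azurerm_data_lake_store" "example" {
  name                = "examplestore"
  resource_group_name = azurerm_resource_group.example.name
  location            = azurerm_resource_group.example.location
}

resource "azurerm_data_lake_analytics_account" "example" {
  name                = "exampleanalytics"
  resource_group_name = azurerm_resource_group.example.name
  location            = azurerm_resource_group.example.location

  # The analytics account must reference an existing Data Lake Store.
  default_store_account_name = azurerm_data_lake_store.example.name

  tags = {
    environment = "staging"
  }
}
```

Referencing the store via azurerm_data_lake_store.example.name lets Terraform infer the dependency, so the store is created before the analytics account.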

Review your Terraform file for Azure best practices

Shisho Cloud, our free checker to make sure your Terraform configuration follows best practices, is available (beta).

Parameters

Explanation in Terraform Registry

Manages an Azure Data Lake Analytics Account.

Tips: Best Practices for The Other Azure Data Lake Resources

In addition to azurerm_data_lake_analytics_account, Azure Data Lake has other resources that should be configured with security in mind. Please check the following examples of those resources and the relevant precautions.


azurerm_data_lake_store

Ensure that encryption of Data Lake storage is enabled

It is better to enable encryption for Data Lake storage.
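As a sketch of this recommendation (resource names are illustrative assumptions), encryption on azurerm_data_lake_store is controlled by the encryption_state argument; it defaults to Enabled, but declaring it explicitly guards against accidental changes:

```hcl
resource "azurerm_data_lake_store" "example" {
  name                = "examplestore"
  resource_group_name = "example-resources"  # assumed existing resource group
  location            = "northeurope"

  # Keep data at rest encrypted with a service-managed key.
  encryption_state = "Enabled"
  encryption_type  = "ServiceManaged"
}
```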

Review your Azure Data Lake settings

In addition to the above, there are other security points you should be aware of. You can review whether your .tf files cover them in Shisho Cloud.

Microsoft.DataLakeAnalytics/accounts (Azure Resource Manager)

The accounts in Microsoft.DataLakeAnalytics can be configured in Azure Resource Manager with the resource name Microsoft.DataLakeAnalytics/accounts. The following sections describe how to use the resource and its parameters.

Example Usage from GitHub

An example could not be found in GitHub.

Parameters

  • apiVersion required - string
  • location required - string

    The resource location.

  • name required - string

    The name of the Data Lake Analytics account.

  • properties required
      • computePolicies optional array
          • name required - string

            The unique name of the compute policy to create.

          • properties required
              • maxDegreeOfParallelismPerJob optional - integer

                The maximum degree of parallelism per job this user can use to submit jobs. This property, the min priority per job property, or both must be passed.

              • minPriorityPerJob optional - integer

                The minimum priority per job this user can use to submit jobs. This property, the max degree of parallelism per job property, or both must be passed.

              • objectId required - string

                The AAD object identifier for the entity to create a policy for.

              • objectType required - string

                The type of AAD object the object identifier refers to.

      • dataLakeStoreAccounts required array
          • name required - string

            The unique name of the Data Lake Store account to add.

          • properties optional
              • suffix optional - string

                The optional suffix for the Data Lake Store account.

      • defaultDataLakeStoreAccount required - string

        The default Data Lake Store account associated with this account.

      • firewallAllowAzureIps optional - string

        The current state of allowing or disallowing IPs originating within Azure through the firewall. If the firewall is disabled, this is not enforced.

      • firewallRules optional array
          • name required - string

            The unique name of the firewall rule to create.

          • properties required
              • endIpAddress required - string

                The end IP address for the firewall rule. This can be either ipv4 or ipv6. Start and End should be in the same protocol.

              • startIpAddress required - string

                The start IP address for the firewall rule. This can be either ipv4 or ipv6. Start and End should be in the same protocol.

      • firewallState optional - string

        The current state of the IP address firewall for this account.

      • maxDegreeOfParallelism optional - integer

        The maximum supported degree of parallelism for this account.

      • maxDegreeOfParallelismPerJob optional - integer

        The maximum supported degree of parallelism per job for this account.

      • maxJobCount optional - integer

        The maximum supported jobs running under the account at the same time.

      • minPriorityPerJob optional - integer

        The minimum supported priority per job for this account.

      • newTier optional - string

        The commitment tier for the next month.

      • queryStoreRetention optional - integer

        The number of days that job metadata is retained.

      • storageAccounts optional array
          • name required - string

            The unique name of the Azure Storage account to add.

          • properties required
              • accessKey required - string

                The access key associated with this Azure Storage account that will be used to connect to it.

              • suffix optional - string

                The optional suffix for the storage account.

  • tags optional - object

    The resource tags, as key-value string pairs.

  • type required - string
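The parameters above can be combined into a minimal template sketch. All names, addresses, and values below are illustrative assumptions, not taken from a real deployment:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.DataLakeAnalytics/accounts",
      "apiVersion": "2016-11-01",
      "name": "exampleanalytics",
      "location": "northeurope",
      "properties": {
        "defaultDataLakeStoreAccount": "examplestore",
        "dataLakeStoreAccounts": [
          { "name": "examplestore" }
        ],
        "firewallState": "Enabled",
        "firewallAllowAzureIps": "Enabled",
        "firewallRules": [
          {
            "name": "office",
            "properties": {
              "startIpAddress": "203.0.113.0",
              "endIpAddress": "203.0.113.255"
            }
          }
        ]
      },
      "tags": {
        "environment": "staging"
      }
    }
  ]
}
```

Enabling firewallState and restricting firewallRules to known ranges follows the same security precautions described above; start and end IP addresses must use the same protocol (both IPv4 or both IPv6).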

Frequently asked questions

What is Azure Data Lake Analytics Account?

Azure Data Lake Analytics Account is a resource for Data Lake of Microsoft Azure. Settings can be written in Terraform.

Where can I find the example code for the Azure Data Lake Analytics Account?

For Terraform, the kevinhead/azurerm and niveklabs/azurerm source code examples are useful. See the Terraform Example section for further details.