Azure Data Factory Data Flow

This page shows how to write Terraform and Azure Resource Manager templates for Data Factory Data Flow and how to configure them securely.

azurerm_data_factory_data_flow (Terraform)

A Data Flow in Azure Data Factory can be configured in Terraform with the resource type azurerm_data_factory_data_flow. The following sections describe how to use the resource and its parameters; a minimal configuration sketch follows the parameter reference.

Example Usage from GitHub

An example could not be found in GitHub.

Review your Terraform file for Azure best practices

Shisho Cloud, our free checker to make sure your Terraform configuration follows best practices, is available (beta).

Parameters

The following arguments are supported:

  • name - (Required) Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.

  • data_factory_id - (Required) The ID of the Data Factory in which to associate the Data Flow. Changing this forces a new resource to be created.

  • script - (Required) The script for the Data Factory Data Flow.

  • source - (Required) One or more source blocks as defined below.

  • sink - (Required) One or more sink blocks as defined below.

  • annotations - (Optional) List of tags that can be used for describing the Data Factory Data Flow.

  • description - (Optional) The description for the Data Factory Data Flow.

  • folder - (Optional) The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.

  • transformation - (Optional) One or more transformation blocks as defined below.


A source block supports the following:

  • name - (Required) The name for the Data Flow Source.

  • description - (Optional) The description for the Data Flow Source.

  • dataset - (Optional) A dataset block as defined below.

  • linked_service - (Optional) A linked_service block as defined below.

  • schema_linked_service - (Optional) A schema_linked_service block as defined below.


A sink block supports the following:

  • name - (Required) The name for the Data Flow Sink.

  • description - (Optional) The description for the Data Flow Sink.

  • dataset - (Optional) A dataset block as defined below.

  • linked_service - (Optional) A linked_service block as defined below.

  • schema_linked_service - (Optional) A schema_linked_service block as defined below.


A dataset block supports the following:

  • name - (Required) The name for the Data Factory Dataset.

  • parameters - (Optional) A map of parameters to associate with the Data Factory dataset.


A linked_service block supports the following:

  • name - (Required) The name for the Data Factory Linked Service.

  • parameters - (Optional) A map of parameters to associate with the Data Factory Linked Service.


A schema_linked_service block supports the following:

  • name - (Required) The name for the Data Factory Linked Service with schema.

  • parameters - (Optional) A map of parameters to associate with the Data Factory Linked Service.


A transformation block supports the following:

  • name - (Required) The name for the Data Flow transformation.

  • description - (Optional) The description for the Data Flow transformation.

The following attributes are exported:

  • id - The ID of the Data Factory Data Flow.
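
Putting the arguments above together, the following is a minimal, hand-written configuration sketch. The referenced data factory and JSON datasets (azurerm_data_factory.example, azurerm_data_factory_dataset_json.source, azurerm_data_factory_dataset_json.sink) are assumed to be defined elsewhere in your configuration, and the data flow script is illustrative only.

resource "azurerm_data_factory_data_flow" "example" {
  name            = "example-dataflow"
  data_factory_id = azurerm_data_factory.example.id

  description = "Example data flow"
  annotations = ["example"]
  folder      = "examples"

  source {
    name = "source1"

    dataset {
      name = azurerm_data_factory_dataset_json.source.name
    }
  }

  sink {
    name = "sink1"

    dataset {
      name = azurerm_data_factory_dataset_json.sink.name
    }
  }

  # Data Flow Script (DFS) wiring source1 to sink1; the options are illustrative.
  script = <<EOT
source(
  allowSchemaDrift: true,
  validateSchema: false,
  documentForm: 'documentPerLine') ~> source1
source1 sink(
  allowSchemaDrift: true,
  validateSchema: false,
  skipDuplicateMapInputs: true,
  skipDuplicateMapOutputs: true) ~> sink1
EOT
}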

Explanation in Terraform Registry

Manages a Data Flow inside an Azure Data Factory.

Tips: Best Practices for Other Azure Data Factory Resources

In addition to azurerm_data_factory_data_flow, Azure Data Factory has other resources that should be configured for security reasons. Please check the following example of such a resource and its precautions.

azurerm_data_factory

Ensure that public access is disabled

It is better to disable public network access for Data Factory, which is enabled by default.
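
As a sketch, disabling public network access on the factory looks like the following; the resource and group names are placeholders, and public_network_enabled is the relevant argument (it defaults to true).

resource "azurerm_data_factory" "example" {
  name                = "example-adf"
  location            = azurerm_resource_group.example.location
  resource_group_name = azurerm_resource_group.example.name

  # Public network access is enabled by default; disable it explicitly.
  public_network_enabled = false
}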

Review your Azure Data Factory settings

In addition to the above, there are other security points you should be aware of. Make sure that your .tf files are protected in Shisho Cloud.

Microsoft.DataFactory/factories/dataflows (Azure Resource Manager)

The factories/dataflows resource in Microsoft.DataFactory can be configured in Azure Resource Manager with the resource type Microsoft.DataFactory/factories/dataflows. The following sections describe how to use the resource and its parameters; a minimal template sketch follows the parameter reference.

Example Usage from GitHub

An example could not be found in GitHub.

Parameters

  • apiVersion required - string
  • name required - string

    The data flow name.

  • properties required
      • annotations optional - array

        List of tags that can be used for describing the data flow.

      • description optional - string

        The description of the data flow.

      • folder optional
          • name optional - string

            The name of the folder that this data flow is in.

  • type required - string
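
Putting the parameters above together, a minimal sketch of the resource in an ARM template looks like the following. The apiVersion, the parent factory name embedded in name, and the MappingDataFlow type inside properties are assumptions for illustration; a deployable data flow would also need typeProperties (sources, sinks, and a script), which this page does not document.

{
  "type": "Microsoft.DataFactory/factories/dataflows",
  "apiVersion": "2018-06-01",
  "name": "example-adf/example-dataflow",
  "properties": {
    "type": "MappingDataFlow",
    "description": "Example data flow",
    "annotations": ["example"],
    "folder": {
      "name": "examples"
    }
  }
}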

Frequently asked questions

What is Azure Data Factory Data Flow?

Azure Data Factory Data Flow is a resource for Data Factory of Microsoft Azure. Settings can be written in Terraform.