Google Dataflow Flex Template Job

This page shows how to write Terraform for a Dataflow Flex Template Job and how to write it securely.

google_dataflow_flex_template_job (Terraform)

The Flex Template Job in Dataflow can be configured in Terraform with the resource name google_dataflow_flex_template_job. The following sections describe one example of how to use the resource and its parameters.

Example Usage from GitHub

dataflow.tf#L1
resource "google_dataflow_flex_template_job" "famflow" {
    provider = google-beta
    name              = "data"
    project = var.project_id
    region = var.region
    container_spec_gcs_path = "gs://dataflow-templates/2021-08-16-00_RC00/flex/Kafka_to_BigQuery"
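The snippet references var.project_id and var.region without declaring them. A minimal sketch of the variable declarations it assumes is shown below; the default region is an assumption for illustration and is not taken from the original repository.

variable "project_id" {
  description = "The project in which the Dataflow Flex Template job runs"
  type        = string
}

variable "region" {
  description = "The region in which the Dataflow Flex Template job runs"
  type        = string
  default     = "us-central1" # assumed default; adjust to your environment
}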

Review your Terraform file for Google best practices

Shisho Cloud, our free checker that verifies your Terraform configuration follows best practices, is available in beta.

Parameters

The following arguments are supported:

  • name - (Required) A unique name for the resource, required by Dataflow.

  • container_spec_gcs_path - (Required) The GCS path to the Dataflow job Flex Template.


  • parameters - (Optional) Key/value pairs to be passed to the Dataflow job (as used in the template). Additional pipeline options such as serviceAccount, workerMachineType, etc. can be specified here (see the sketch after this list).

  • labels - (Optional) User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. Note: This field is marked as deprecated in Terraform as the API does not currently support adding labels. Note: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

  • on_delete - (Optional) One of "drain" or "cancel". Specifies the behavior applied to the job when it is deleted during terraform destroy.

  • project - (Optional) The project in which the resource belongs. If it is not provided, the provider project is used.

  • region - (Optional) The region in which the created job should run.
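For reference, the sketch below combines several of these arguments in a single resource block. The template path, parameter names, and values are illustrative assumptions and are not taken from the geewynn/family-api example.

resource "google_dataflow_flex_template_job" "example" {
  provider                = google-beta
  name                    = "example-flex-job"
  region                  = "us-central1"
  container_spec_gcs_path = "gs://my-bucket/templates/my-template.json" # illustrative path

  # Key/value pairs passed to the pipeline; additional options such as
  # serviceAccount or workerMachineType can also be set here.
  parameters = {
    inputTopic      = "projects/my-project/topics/my-topic"
    outputTableSpec = "my-project:my_dataset.my_table"
  }

  # Drain in-flight data instead of cancelling the job on terraform destroy.
  on_delete = "drain"
}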

In addition to the arguments listed above, the following computed attributes are exported:

  • job_id - The unique ID of this job.

  • state - The current state of the resource, selected from the JobState enum. The sketch below shows how both computed attributes can be referenced as outputs.
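A minimal sketch of exposing these computed attributes as outputs; the output names are illustrative assumptions.

output "dataflow_job_id" {
  description = "The unique ID of the Flex Template job"
  value       = google_dataflow_flex_template_job.famflow.job_id
}

output "dataflow_job_state" {
  description = "The current state of the Flex Template job"
  value       = google_dataflow_flex_template_job.famflow.state
}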

Explanation in Terraform Registry

Creates a Flex Template job on Dataflow, which is an implementation of Apache Beam running on Google Compute Engine. For more information see the official documentation for Beam and Dataflow.

Frequently asked questions

What is Google Dataflow Flex Template Job?

Google Dataflow Flex Template Job is a resource for Dataflow on Google Cloud Platform. Settings can be written in Terraform.

Where can I find the example code for the Google Dataflow Flex Template Job?

For Terraform, the geewynn/family-api source code example is useful. See the Example Usage from GitHub section above for further details.


Automate config file reviews on your commits

Fix issues in your infrastructure as code with auto-generated patches.