Google BigQuery Data Transfer Config
This page shows how to write Terraform for the BigQuery Data Transfer Config resource, and how to write it securely.
google_bigquery_data_transfer_config (Terraform)
The Config in BigQuery Data Transfer can be configured in Terraform with the resource name google_bigquery_data_transfer_config. The following sections describe one example of how to use the resource and its parameters.
Example Usage from GitHub
resource "google_bigquery_data_transfer_config" "this" {
  data_refresh_window_days = var.data_refresh_window_days
  data_source_id           = var.data_source_id
  destination_dataset_id   = var.destination_dataset_id
  disabled                 = var.disabled
  display_name             = var.display_name
  params                   = var.params
}
Parameters
-
data_refresh_window_days
optional - number
The number of days to look back to automatically refresh the data. For example, if dataRefreshWindowDays = 10, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value.
-
data_source_id
required - string
The data source id. Cannot be changed once the transfer config is created.
-
destination_dataset_id
required - string
The BigQuery target dataset id.
-
disabled
optional - bool
When set to true, no runs are scheduled for a given transfer.
-
display_name
required - string
The user specified display name for the transfer config.
-
location
optional - string
The geographic location where the transfer config should reside. Examples: US, EU, asia-northeast1. The default value is US.
-
name
optional computed - string
The resource name of the transfer config. Transfer config names have the form projects/[projectId]/locations/[location]/transferConfigs/[configId]. Where configId is usually a uuid, but this is not required. The name is ignored when creating a transfer config.
-
notification_pubsub_topic
optional - string
Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish.
-
params
required - map from string to string
These parameters are specific to each data source.
-
schedule
optional - string
Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan, jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format NOTE: the granularity should be at least 8 hours, or less frequent.
-
service_account_name
optional - string
Optional service account name. If this field is set, the transfer config will be created with this service account's credentials. The user calling this API must have permission to act as this service account.
-
email_preferences
list block
-
enable_failure_email
required - bool
If true, email notifications will be sent on transfer run failures.
-
-
schedule_options
list block
-
disable_auto_scheduling
optional - bool
If true, automatic scheduling of data transfer runs for this configuration will be disabled. The runs can be started on ad-hoc basis using transferConfigs.startManualRuns API. When automatic scheduling is disabled, the TransferConfig.schedule field will be ignored.
-
end_time
optional - string
Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
-
start_time
optional - string
Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option.
-
-
sensitive_params
list block
-
secret_access_key
required - string
The Secret Access Key of the AWS account transferring data from.
-
-
timeouts
single block
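To illustrate how the schedule, schedule_options, and params parameters fit together, here is a minimal sketch of a scheduled-query transfer. The resource label, display name, dataset reference, timestamp, and query are illustrative assumptions, not values from this page:

```hcl
# Minimal sketch of a scheduled-query transfer; all names/values are illustrative.
resource "google_bigquery_data_transfer_config" "scheduled_query" {
  display_name           = "nightly-aggregation"
  data_source_id         = "scheduled_query"
  destination_dataset_id = google_bigquery_dataset.reporting.dataset_id
  location               = "US"

  # Schedule granularity should be at least 8 hours (see the schedule notes above).
  schedule = "every 24 hours"

  schedule_options {
    # First run is scheduled at or after this RFC 3339 timestamp.
    start_time = "2024-01-01T00:00:00Z"
  }

  # params keys are specific to each data source; for scheduled queries,
  # the SQL is passed under the "query" key.
  params = {
    query = "SELECT CURRENT_DATE() AS run_date"
  }
}
```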
Explanation in Terraform Registry
Represents a data transfer configuration. A transfer configuration contains all metadata needed to perform a data transfer. To get more information about Config, see:
- API documentation
- How-to Guides
- Official Documentation
Warning: All arguments including
sensitive_params.secret_access_key
will be stored in the raw state as plain-text. Read more about sensitive data in state.
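One way to keep the AWS secret out of version-controlled .tf files is to pass sensitive_params through a variable marked sensitive. This is a hedged sketch, not the only pattern: the variable and resource names, bucket path, and params keys are illustrative, and the value will still be written to the raw state even though it is redacted from plan/apply output:

```hcl
# Illustrative sketch: supply the secret via TF_VAR_aws_secret_access_key
# or a secret store, never hard-coded in configuration.
variable "aws_access_key_id" {
  type = string
}

variable "aws_secret_access_key" {
  type      = string
  sensitive = true # redacted from CLI output, but still stored in raw state
}

resource "google_bigquery_data_transfer_config" "s3_transfer" {
  display_name           = "s3-import"  # illustrative name
  data_source_id         = "amazon_s3"
  destination_dataset_id = var.destination_dataset_id
  location               = "US"
  schedule               = "every 24 hours"

  # Keys here are assumptions for an S3-style transfer; check the data
  # source's documentation for the exact params it expects.
  params = {
    destination_table_name_template = "imported_events"
    data_path                       = "s3://example-bucket/events/*"
    access_key_id                   = var.aws_access_key_id
    file_format                     = "CSV"
  }

  sensitive_params {
    secret_access_key = var.aws_secret_access_key
  }
}
```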
Frequently asked questions
What is Google BigQuery Data Transfer Config?
Google BigQuery Data Transfer Config is a resource for BigQuery Data Transfer of Google Cloud Platform. Settings can be written in Terraform.
Where can I find the example code for the Google BigQuery Data Transfer Config?
For Terraform, the niveklabs/google source code example is useful. See the Example Usage section above for further details.