Google Cloud (Stackdriver) Logging Project Sink

This page shows how to write Terraform for the Cloud (Stackdriver) Logging Project Sink and how to configure it securely.

google_logging_project_sink (Terraform)

The Project Sink in Cloud (Stackdriver) Logging can be configured in Terraform with the resource name google_logging_project_sink. The following sections describe three examples of how to use the resource and its parameters.

Example Usage from GitHub
resource "google_logging_project_sink" "project_sink_good_1" {
  name                   = "my-pubsub-instance-sink"
  destination            = "" # destination value omitted in the original example
  filter                 = "resource.type = gce_instance AND severity >= WARNING"
  unique_writer_identity = true
}

resource "google_logging_project_sink" "basic" {
  name = "my-pubsub-instance-sink"

  destination = "fake"
}
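The fragments above omit a real destination. A complete sketch, assuming a hypothetical Pub/Sub topic named `instance-activity` in a hypothetical project `my-project-id` (neither name is from the original):

```hcl
# Hypothetical Pub/Sub topic used as the sink destination (placeholder names).
resource "google_pubsub_topic" "instance_activity" {
  name = "instance-activity"
}

resource "google_logging_project_sink" "project_sink_good_1" {
  name = "my-pubsub-instance-sink"

  # Pub/Sub destinations take the form
  # pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID].
  destination = "pubsub.googleapis.com/projects/my-project-id/topics/${google_pubsub_topic.instance_activity.name}"

  # Export only warnings (and above) from Compute Engine instances.
  filter = "resource.type = gce_instance AND severity >= WARNING"

  # Create a dedicated service account for this sink.
  unique_writer_identity = true
}
```

For the export to succeed, the sink's writer_identity must also be granted roles/pubsub.publisher on the topic.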



  • description optional - string

A description of this sink. The maximum length of the description is 8000 characters.

  • destination required - string

The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a Pub/Sub topic, or a BigQuery dataset. Examples: "storage.googleapis.com/[GCS_BUCKET]", "bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]", "pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]". The writer associated with the sink must have access to write to the above resource.

  • disabled optional - bool

If set to True, then this sink is disabled and it does not export any log entries.

  • filter optional - string

The filter to apply when exporting logs. Only log entries that match the filter are exported.

  • id optional computed - string
  • name required - string

The name of the logging sink.

  • project optional computed - string

The ID of the project to create the sink in. If omitted, the project associated with the provider is used.

  • unique_writer_identity optional - bool

Whether or not to create a unique identity associated with this sink. If false (the default), then the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, then a unique service account is created and used for this sink. If you wish to publish logs across projects, you must set unique_writer_identity to true.

  • writer_identity optional computed - string

The identity associated with this sink. This identity must be granted write access to the configured destination.
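Putting the identity-related arguments together: a minimal sketch of granting a sink's generated writer_identity write access to a Cloud Storage destination (the bucket name is a hypothetical placeholder, not from the original):

```hcl
resource "google_storage_bucket" "sink_bucket" {
  name     = "example-sink-bucket" # hypothetical bucket name
  location = "US"
}

resource "google_logging_project_sink" "storage" {
  name        = "storage-sink"
  destination = "storage.googleapis.com/${google_storage_bucket.sink_bucket.name}"

  # A dedicated service account is created; its email is exported as writer_identity.
  unique_writer_identity = true
}

# The generated identity must be allowed to write objects into the bucket.
resource "google_storage_bucket_iam_member" "sink_writer" {
  bucket = google_storage_bucket.sink_bucket.name
  role   = "roles/storage.objectCreator"
  member = google_logging_project_sink.storage.writer_identity
}
```

Referencing writer_identity directly in the IAM member resource makes Terraform create the grant after the sink, so the export starts working on the first apply.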

  • bigquery_options list block

    • use_partitioned_tables required - bool

    Whether to use BigQuery's partitioned tables. By default, Logging creates dated tables based on the log entries' timestamps, e.g. syslog_20170523. With partitioned tables the date suffix is no longer present and special query syntax has to be used instead. In both cases, tables are sharded based on UTC timezone.

  • exclusions list block

    • description optional - string

    A description of this exclusion.

    • disabled optional - bool

    If set to True, then this exclusion is disabled and it does not exclude any log entries.

    • filter required - string

    An advanced logs filter that matches the log entries to be excluded. By using the sample function, you can exclude less than 100% of the matching log entries.

    • name required - string

    A client-assigned identifier, such as "load-balancer-exclusion". Identifiers are limited to 100 characters and can include only letters, digits, underscores, hyphens, and periods. First character has to be alphanumeric.
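The bigquery_options block described above can be sketched as follows; the dataset name and project ID are hypothetical placeholders, not from the original:

```hcl
resource "google_bigquery_dataset" "logs" {
  dataset_id = "example_log_dataset" # hypothetical dataset name
  location   = "US"
}

resource "google_logging_project_sink" "bigquery" {
  name        = "bigquery-sink"
  destination = "bigquery.googleapis.com/projects/my-project-id/datasets/${google_bigquery_dataset.logs.dataset_id}"

  unique_writer_identity = true

  bigquery_options {
    # Write to date-partitioned tables instead of dated tables such as syslog_20170523.
    use_partitioned_tables = true
  }
}
```

The sink's writer_identity would additionally need roles/bigquery.dataEditor on the dataset for writes to succeed.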

Explanation in Terraform Registry

Manages a project-level logging sink. For more information see:

  • API documentation
  • How-to Guides
    • Exporting Logs

      You can specify exclusions for log sinks created by Terraform by using the exclusions field of google_logging_project_sink.

      Note: You must have granted the "Logs Configuration Writer" IAM role (roles/logging.configWriter) to the credentials used with Terraform.

      Note: You must enable the Cloud Resource Manager API.

resource "google_compute_instance" "my-logged-instance" {
  name         = "my-instance"
  machine_type = "e2-medium"
  zone         = "us-central1-a"

  boot_disk {
    initialize_params {
      image = "debian-cloud/debian-9"
    }
  }

  network_interface {
    network = "default"

    access_config {
    }
  }
}

resource "google_storage_bucket" "log-bucket" {
  name     = "my-unique-logging-bucket"
  location = "US"
}

resource "google_logging_project_sink" "instance-sink" {
  name        = "my-instance-sink"
  description = "some explanation on what this is"
  destination = "storage.googleapis.com/${google_storage_bucket.log-bucket.name}"
  filter      = "resource.type = gce_instance AND resource.labels.instance_id = \"${google_compute_instance.my-logged-instance.instance_id}\""

  unique_writer_identity = true
}

resource "google_project_iam_binding" "log-writer" {
  project = "your-project-id"
  role    = "roles/storage.objectCreator"

  members = [
    google_logging_project_sink.instance-sink.writer_identity,
  ]
}

The following example uses `exclusions` to filter logs that will not be exported. In this example, logs are exported to a log bucket and there are two exclusions configured:
resource "google_logging_project_sink" "log-bucket" {
  name        = "my-logging-sink"
  destination = "" # destination value omitted in the original example

  exclusions {
    name        = "nsexclusion1"
    description = "Exclude logs from namespace-1 in k8s"
    filter      = "resource.type = k8s_container resource.labels.namespace_name=\"namespace-1\" "
  }

  exclusions {
    name        = "nsexclusion2"
    description = "Exclude logs from namespace-2 in k8s"
    filter      = "resource.type = k8s_container resource.labels.namespace_name=\"namespace-2\" "
  }

  unique_writer_identity = true
}

Frequently asked questions

What is Google Cloud (Stackdriver) Logging Project Sink?

Google Cloud (Stackdriver) Logging Project Sink is a resource for Cloud (Stackdriver) Logging of Google Cloud Platform. Settings can be written in Terraform.

Where can I find the example code for the Google Cloud (Stackdriver) Logging Project Sink?

For Terraform, the melscoop-test/check, SnidermanIndustries/checkov-fork, and infracost/infracost source code examples are useful. See the Terraform Example section for further details.

