GCP Pub/Sub

End-to-end guide for ingesting Google Cloud logs via Pub/Sub subscriptions

This guide walks through ingesting Google Cloud logs into nano using Pub/Sub — Cloud Audit Logs, VPC Flow Logs, Security Command Center findings, GKE logs, and any other log type exportable via Cloud Logging.

nano connects to a Pub/Sub pull subscription and consumes messages in batches. Google Cloud's Log Router exports logs to a Pub/Sub topic, and nano reads from a subscription on that topic.
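Under the hood, each Pub/Sub message carries one Cloud Logging LogEntry serialized as JSON in the message's data field, which the Pub/Sub REST API returns base64-encoded. A minimal Python sketch of the decode step any pull consumer performs (the envelope shape and values here are illustrative, not nano internals):

```python
import base64
import json

def decode_log_message(message: dict) -> dict:
    """Decode one Pub/Sub message envelope into the Cloud Logging
    LogEntry it carries. The Log Router publishes each matching
    LogEntry as JSON in `data`; the REST API base64-encodes it."""
    payload = base64.b64decode(message["data"])
    return json.loads(payload)

# Illustrative envelope, shaped like projects.subscriptions.pull output
entry_json = json.dumps({
    "logName": "projects/my-project/logs/cloudaudit.googleapis.com%2Factivity",
    "severity": "NOTICE",
    "timestamp": "2025-01-15T14:23:45.678Z",
})
envelope = {
    "messageId": "1234567890",
    "publishTime": "2025-01-15T14:23:45.678Z",
    "data": base64.b64encode(entry_json.encode()).decode(),
}

entry = decode_log_message(envelope)
print(entry["severity"])  # NOTICE
```

This is why the sample log events later in this guide are plain LogEntry JSON: that is what arrives after the envelope is stripped.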

Prerequisites

  • A Google Cloud project with billing enabled
  • gcloud CLI installed and authenticated (gcloud auth login)
  • Permissions to create service accounts, Pub/Sub resources, and log sinks (typically roles/owner or a combination of roles/iam.serviceAccountAdmin, roles/pubsub.admin, roles/logging.admin)
  • A running nano instance

Step 1: Create a Pub/Sub Topic and Subscription

Create a dedicated topic for the log type you want to ingest, and a pull subscription that nano will read from.

# Set your project
export PROJECT_ID="your-gcp-project-id"
gcloud config set project $PROJECT_ID

# Create the topic
gcloud pubsub topics create nanosiem-audit-logs

# Create a pull subscription
gcloud pubsub subscriptions create nanosiem-audit-logs-sub \
  --topic=nanosiem-audit-logs \
  --ack-deadline=600 \
  --message-retention-duration=1d \
  --expiration-period=never

Naming Convention

Use descriptive names that match your nano log source names:

Log Type          Topic Name             Subscription Name
Cloud Audit Logs  nanosiem-audit-logs    nanosiem-audit-logs-sub
VPC Flow Logs     nanosiem-vpc-flowlogs  nanosiem-vpc-flowlogs-sub
SCC Findings      nanosiem-scc-findings  nanosiem-scc-findings-sub
GKE Logs          nanosiem-gke-logs      nanosiem-gke-logs-sub

Step 2: Create a Log Router Sink

A Log Router sink exports matching log entries from Cloud Logging to your Pub/Sub topic. The sink filter determines which logs are exported.

Cloud Audit Logs (Admin Activity + Data Access)

gcloud logging sinks create nanosiem-audit-sink \
  pubsub.googleapis.com/projects/$PROJECT_ID/topics/nanosiem-audit-logs \
  --log-filter='logName:"cloudaudit.googleapis.com"' \
  --description="Export audit logs to nano"

VPC Flow Logs

gcloud logging sinks create nanosiem-vpcflow-sink \
  pubsub.googleapis.com/projects/$PROJECT_ID/topics/nanosiem-vpc-flowlogs \
  --log-filter='resource.type="gce_subnetwork" AND logName:"compute.googleapis.com%2Fvpc_flows"' \
  --description="Export VPC Flow Logs to nano"

Security Command Center Findings

gcloud logging sinks create nanosiem-scc-sink \
  pubsub.googleapis.com/projects/$PROJECT_ID/topics/nanosiem-scc-findings \
  --log-filter='resource.type="threat_detector" OR resource.type="security_command_center"' \
  --description="Export SCC findings to nano"

GKE Audit Logs

gcloud logging sinks create nanosiem-gke-sink \
  pubsub.googleapis.com/projects/$PROJECT_ID/topics/nanosiem-gke-logs \
  --log-filter='resource.type="k8s_cluster" AND logName:"cloudaudit.googleapis.com"' \
  --description="Export GKE audit logs to nano"

Custom Filter (All Security-Relevant Logs)

To export multiple log types to a single topic:

gcloud logging sinks create nanosiem-all-security \
  pubsub.googleapis.com/projects/$PROJECT_ID/topics/nanosiem-security-logs \
  --log-filter='
    logName:"cloudaudit.googleapis.com"
    OR (resource.type="gce_subnetwork" AND logName:"compute.googleapis.com%2Fvpc_flows")
    OR resource.type="threat_detector"
  ' \
  --description="Export all security-relevant logs to nano"

Grant the Sink's Service Account Publish Access

When you create a log sink, GCP creates a dedicated writer service account. This account needs permission to publish to your Pub/Sub topic.

# Get the sink's writer identity
SINK_SA=$(gcloud logging sinks describe nanosiem-audit-sink \
  --format='value(writerIdentity)')
echo "Sink service account: $SINK_SA"

# Grant it the Pub/Sub Publisher role on the topic
gcloud pubsub topics add-iam-policy-binding nanosiem-audit-logs \
  --member="$SINK_SA" \
  --role="roles/pubsub.publisher"

Verify the Sink Is Working

# Check sink status (should show no errors)
gcloud logging sinks describe nanosiem-audit-sink

# Wait a minute, then check if messages are arriving
gcloud pubsub subscriptions pull nanosiem-audit-logs-sub \
  --limit=1 \
  --auto-ack=false

If no messages appear after a few minutes, generate some activity (e.g., list IAM roles: gcloud iam roles list --limit=1) and check again. Admin Activity audit logs are generated immediately; Data Access logs may have a slight delay.

Step 3: Create a Service Account for nano

Create a dedicated service account with minimal permissions for nano to pull from the subscription.

# Create the service account
gcloud iam service-accounts create nanosiem-reader \
  --display-name="nano Log Reader" \
  --description="Service account for nano to pull logs from Pub/Sub"

# Get the full service account email
SA_EMAIL="nanosiem-reader@${PROJECT_ID}.iam.gserviceaccount.com"
echo "Service account: $SA_EMAIL"

Grant Permissions

nano needs two permissions on the subscription: consume messages and acknowledge them.

# Grant Pub/Sub Subscriber role on the subscription
gcloud pubsub subscriptions add-iam-policy-binding nanosiem-audit-logs-sub \
  --member="serviceAccount:${SA_EMAIL}" \
  --role="roles/pubsub.subscriber"

The roles/pubsub.subscriber role includes:

  • pubsub.subscriptions.consume — pull messages
  • pubsub.subscriptions.get — check subscription status

Create and Download the Key

# Create a JSON key file
gcloud iam service-accounts keys create /tmp/nanosiem-reader-key.json \
  --iam-account="$SA_EMAIL"

echo "Key saved to /tmp/nanosiem-reader-key.json"

The key file will look like this (you'll paste the entire contents into nano):

{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "key-id",
  "private_key": "-----BEGIN RSA PRIVATE KEY-----\n...\n-----END RSA PRIVATE KEY-----\n",
  "client_email": "nanosiem-reader@your-project.iam.gserviceaccount.com",
  "client_id": "123456789",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/..."
}
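
Before pasting the key into nano, it can be worth checking that the copy wasn't truncated. A small sketch, assuming only the fields every GCP service account key contains (validate_key_json is a hypothetical helper, not part of nano):

```python
import json

# Fields present in every GCP service account key file; checking them
# catches a truncated copy/paste before it reaches nano.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email", "token_uri"}

def validate_key_json(raw: str) -> dict:
    """Sanity-check pasted key material. Raises on truncated or
    non-service-account JSON."""
    key = json.loads(raw)  # fails fast if the paste was cut off
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError(f"key is missing fields: {sorted(missing)}")
    if key.get("type") != "service_account":
        raise ValueError(f"unexpected credential type: {key.get('type')}")
    return key
```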

Step 4: Store Credentials in nano

  1. Navigate to SettingsCloud Credentials
  2. Click Add Credential
  3. Fill in the form:
     Field                 Value
     Provider              GCP Pub/Sub
     Name                  A descriptive name, e.g. GCP Production - Audit Logs
     Service Account JSON  Paste the entire contents of the JSON key file
  4. Click Save

The credential is encrypted at rest. The private key will never be displayed again.

# After uploading, delete the local key file
rm /tmp/nanosiem-reader-key.json

Step 5: Create a Log Source

  1. Navigate to FeedsNew Feed (or use the Log Source Wizard)
  2. Select "I have sample logs" and paste a representative log entry (see examples below)
  3. The AI will detect the format and generate a VRL parser
  4. Configure the source connection:
     Field                   Value
     Source Type             GCP Pub/Sub
     GCP Project ID          Your project ID, e.g. my-gcp-project-123
     Subscription Name       The subscription name (not the full path), e.g. nanosiem-audit-logs-sub
     ACK Deadline (seconds)  600 (default, matches the subscription setting)
     Credential              Select the credential you created in Step 4
  5. Set the feed metadata (name, category, vendor, product)
  6. Publish the parser to create a version and deploy to Vector

Sample Log Events by GCP Service

Cloud Audit Log (Admin Activity)

{
  "protoPayload": {
    "@type": "type.googleapis.com/google.cloud.audit.AuditLog",
    "serviceName": "iam.googleapis.com",
    "methodName": "google.iam.admin.v1.CreateServiceAccount",
    "authenticationInfo": {
      "principalEmail": "alice@example.com"
    },
    "requestMetadata": {
      "callerIp": "203.0.113.50",
      "callerSuppliedUserAgent": "google-cloud-sdk gcloud/456.0.0"
    },
    "resourceName": "projects/my-project/serviceAccounts/new-sa@my-project.iam.gserviceaccount.com",
    "request": {
      "account_id": "new-sa",
      "service_account": {
        "display_name": "New Service Account"
      }
    }
  },
  "insertId": "abc123def456",
  "resource": {
    "type": "service_account",
    "labels": {
      "project_id": "my-project",
      "email_id": "new-sa@my-project.iam.gserviceaccount.com"
    }
  },
  "timestamp": "2025-01-15T14:23:45.678Z",
  "severity": "NOTICE",
  "logName": "projects/my-project/logs/cloudaudit.googleapis.com%2Factivity"
}
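
The fields detections typically key on live under protoPayload. A sketch of the flattening a parser would perform on an entry like the sample above (output names such as user and src_ip are illustrative, not nano's actual schema; in nano this mapping would be done in VRL):

```python
def summarize_audit_entry(entry: dict) -> dict:
    """Flatten the detection-relevant fields out of an Admin Activity
    LogEntry. Field paths follow the Cloud Audit Log structure shown
    in the sample; output names are illustrative."""
    proto = entry.get("protoPayload", {})
    return {
        "timestamp": entry.get("timestamp"),
        "user": proto.get("authenticationInfo", {}).get("principalEmail"),
        "src_ip": proto.get("requestMetadata", {}).get("callerIp"),
        "action": proto.get("methodName"),
        "service": proto.get("serviceName"),
        "resource": proto.get("resourceName"),
        "severity": entry.get("severity"),
    }
```

Applied to the sample above, this yields user alice@example.com, src_ip 203.0.113.50, and action google.iam.admin.v1.CreateServiceAccount.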

VPC Flow Log

{
  "insertId": "1a2b3c4d",
  "jsonPayload": {
    "connection": {
      "src_ip": "10.128.0.5",
      "src_port": 49152,
      "dest_ip": "203.0.113.50",
      "dest_port": 443,
      "protocol": 6
    },
    "src_instance": {
      "project_id": "my-project",
      "zone": "us-central1-a",
      "vm_name": "web-server-1"
    },
    "bytes_sent": "5040",
    "packets_sent": "25",
    "start_time": "2025-01-15T14:23:00Z",
    "end_time": "2025-01-15T14:24:00Z",
    "reporter": "SRC"
  },
  "resource": {
    "type": "gce_subnetwork",
    "labels": {
      "project_id": "my-project",
      "subnetwork_name": "default",
      "location": "us-central1-a"
    }
  },
  "timestamp": "2025-01-15T14:24:05.123Z",
  "logName": "projects/my-project/logs/compute.googleapis.com%2Fvpc_flows"
}
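
Two parsing quirks are worth noting in this format: the counters (bytes_sent, packets_sent) arrive as JSON strings and need casting, and protocol is a numeric IANA protocol number (6 = TCP, 17 = UDP). A sketch of the normalization a parser would perform (output field names are illustrative):

```python
# Common IANA protocol numbers seen in VPC Flow Logs
PROTOCOL_NAMES = {1: "icmp", 6: "tcp", 17: "udp"}

def normalize_flow(payload: dict) -> dict:
    """Normalize a VPC Flow Log jsonPayload: cast string counters to
    integers and map the numeric protocol to a name."""
    conn = payload["connection"]
    return {
        "src_ip": conn["src_ip"],
        "src_port": conn["src_port"],
        "dest_ip": conn["dest_ip"],
        "dest_port": conn["dest_port"],
        "protocol": PROTOCOL_NAMES.get(conn["protocol"], str(conn["protocol"])),
        "bytes_sent": int(payload["bytes_sent"]),
        "packets_sent": int(payload["packets_sent"]),
        "reporter": payload.get("reporter"),  # SRC or DEST side of the flow
    }
```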

Security Command Center Finding

{
  "finding": {
    "name": "organizations/123/sources/456/findings/abc123",
    "parent": "organizations/123/sources/456",
    "state": "ACTIVE",
    "category": "PUBLIC_BUCKET_ACL",
    "resourceName": "//storage.googleapis.com/my-public-bucket",
    "severity": "HIGH",
    "eventTime": "2025-01-15T14:23:45Z",
    "createTime": "2025-01-15T14:23:45Z",
    "sourceProperties": {
      "ResourcePath": "my-public-bucket",
      "AllowedPolicy": "allUsers"
    }
  },
  "resource": {
    "type": "google.cloud.storage.Bucket",
    "name": "//storage.googleapis.com/my-public-bucket",
    "project": "my-project"
  }
}
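
SCC findings carry a severity label (CRITICAL, HIGH, MEDIUM, LOW) rather than a numeric score, so any post-processing that sorts or thresholds findings needs a label-to-rank mapping. An illustrative sketch (the ranking is an assumption for triage purposes, not nano's scoring):

```python
# SCC severity labels ranked for triage; unknown/unspecified ranks lowest.
SEVERITY_RANK = {"CRITICAL": 4, "HIGH": 3, "MEDIUM": 2, "LOW": 1}

def triage_order(findings: list[dict]) -> list[dict]:
    """Sort SCC finding notifications (shaped like the sample above)
    so the most severe come first."""
    return sorted(
        findings,
        key=lambda f: SEVERITY_RANK.get(f["finding"].get("severity", ""), 0),
        reverse=True,
    )
```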

Step 6: Verify Ingestion

After publishing, allow a few minutes for Vector to connect to the subscription and start pulling messages.

Check Feed Health

  1. Go to Feeds → select your new log source
  2. On the Overview tab, check:
    • Status: Should show "Healthy"
    • Event Volume chart: Should show events arriving
    • Last Event: Should show a recent timestamp

Search Your Data

Navigate to Search and run a query for your source type:

source_type="gcp_audit"

Or search for specific activity:

source_type="gcp_audit" methodName="CreateServiceAccount"
| table timestamp, user, src_ip, methodName, resourceName

Check for Errors

If no data appears:

  1. Check the subscription backlog — Are messages accumulating?

    gcloud pubsub subscriptions describe nanosiem-audit-logs-sub \
      --format='value(numUndeliveredMessages)'
    • Messages accumulating: nano isn't pulling. Check credentials and the log source deployment status.
    • Zero messages: The Log Router sink isn't exporting, or no matching logs are being generated. Check the sink configuration and generate some test activity.
  2. Verify the sink is delivering:

    # Check for sink errors
    gcloud logging sinks describe nanosiem-audit-sink \
      --format='yaml(writerIdentity,destination,filter,outputVersionFormat)'
  3. Check Vector logs for connection errors:

    docker logs nanosiem-vector 2>&1 | grep -i "gcp\|pubsub\|error"
  4. Check ingestion errors in nano at SystemIngestion Errors

Multi-Project Setup

For organizations with multiple GCP projects, you have two approaches:

Option A: Aggregated Sink (Organization or Folder Level)

Create a single sink at the organization or folder level that captures logs from all projects:

# Organization-level sink (requires Org Admin)
gcloud logging sinks create nanosiem-org-audit-sink \
  pubsub.googleapis.com/projects/$PROJECT_ID/topics/nanosiem-audit-logs \
  --organization=ORG_ID \
  --log-filter='logName:"cloudaudit.googleapis.com"' \
  --include-children \
  --description="Export all org audit logs to nano"

# Grant the sink's writer identity publish access
SINK_SA=$(gcloud logging sinks describe nanosiem-org-audit-sink \
  --organization=ORG_ID \
  --format='value(writerIdentity)')

gcloud pubsub topics add-iam-policy-binding nanosiem-audit-logs \
  --member="$SINK_SA" \
  --role="roles/pubsub.publisher"

Option B: Per-Project Sinks

Create individual sinks in each project, all publishing to the same topic. More granular control but more to manage.

Troubleshooting

"Permission denied" in Vector logs

  • Verify the service account has roles/pubsub.subscriber on the subscription (not the topic)
  • Check the service account JSON was pasted correctly (it should start with { and end with })
  • If using Workload Identity, verify the Kubernetes service account is annotated correctly

Messages in subscription but no events in nano

  • Check the log source is deployed (published) — look for "Unpublished changes" banner
  • Verify the Subscription Name in the log source config is just the name (e.g., nanosiem-audit-logs-sub), not the full resource path
  • Verify the Project ID matches the project where the subscription lives
  • Look for parse errors in SystemIngestion Errors

Sink shows errors in Cloud Logging

  • Check the writer identity has roles/pubsub.publisher on the topic
  • Verify the topic exists and hasn't been deleted
  • Check the Pub/Sub API is enabled: gcloud services enable pubsub.googleapis.com

High latency or message redelivery

  • Increase the ACK deadline on the subscription and in nano's log source config (the two values should match)
  • Check Vector resource allocation — Pub/Sub processing may need more CPU/memory for high-volume subscriptions
  • Consider multiple subscriptions to the same topic for parallel consumption

Subscription expiration

By default, Pub/Sub subscriptions expire after 31 days of inactivity. Use --expiration-period=never when creating subscriptions (as shown in Step 1) to prevent this. For existing subscriptions:

gcloud pubsub subscriptions update nanosiem-audit-logs-sub \
  --expiration-period=never

Recommended Log Sources

If you're starting from scratch, prioritize these GCP log sources:

Priority  Source                     Log Filter                       Why
1         Admin Activity Audit Logs  logName:"activity"               All administrative actions — IAM changes, resource creation, config modifications
2         Security Command Center    resource.type="threat_detector"  GCP-native threat detection and misconfiguration findings
3         Data Access Audit Logs     logName:"data_access"            Who read/listed/exported what data (high volume — filter carefully)
4         VPC Flow Logs              resource.type="gce_subnetwork"   Network traffic for lateral movement and exfiltration detection
5         GKE Audit Logs             resource.type="k8s_cluster"      Kubernetes API activity for container security monitoring
6         Cloud DNS Logs             resource.type="dns_query"        DNS queries for C2 and exfiltration detection
