GCP Pub/Sub
End-to-end guide for ingesting Google Cloud logs via Pub/Sub subscriptions
This guide walks through ingesting Google Cloud logs into nano using Pub/Sub — Cloud Audit Logs, VPC Flow Logs, Security Command Center findings, GKE logs, and any other log type exportable via Cloud Logging.
nano connects to a Pub/Sub pull subscription and consumes messages in batches. Google Cloud's Log Router exports logs to a Pub/Sub topic, and nano reads from a subscription on that topic.
Prerequisites
- A Google Cloud project with billing enabled
- gcloud CLI installed and authenticated (gcloud auth login)
- Permissions to create service accounts, Pub/Sub resources, and log sinks (typically roles/owner or a combination of roles/iam.serviceAccountAdmin, roles/pubsub.admin, and roles/logging.admin)
- A running nano instance
Step 1: Create a Pub/Sub Topic and Subscription
Create a dedicated topic for the log type you want to ingest, and a pull subscription that nano will read from.
# Set your project
export PROJECT_ID="your-gcp-project-id"
gcloud config set project $PROJECT_ID
# Create the topic
gcloud pubsub topics create nanosiem-audit-logs
# Create a pull subscription
gcloud pubsub subscriptions create nanosiem-audit-logs-sub \
--topic=nanosiem-audit-logs \
--ack-deadline=600 \
--message-retention-duration=1d \
--expiration-period=never

Subscription, not topic: nano connects to the subscription, not the topic directly. Vector uses the Pub/Sub pull API to fetch and acknowledge messages. The --ack-deadline=600 (10 minutes) gives Vector time to process batches before messages are redelivered.
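nano's source config (Step 5) asks for the bare subscription name, while the Pub/Sub API itself addresses resources by full path. A minimal sketch of the mapping, using the names from this guide:

```python
# Sketch: how bare Pub/Sub names map to the full resource paths the API uses.
# The project and resource names are the placeholders from this guide.

def topic_path(project_id: str, topic: str) -> str:
    """Full resource path for a Pub/Sub topic."""
    return f"projects/{project_id}/topics/{topic}"

def subscription_path(project_id: str, subscription: str) -> str:
    """Full resource path for a Pub/Sub subscription."""
    return f"projects/{project_id}/subscriptions/{subscription}"

print(topic_path("your-gcp-project-id", "nanosiem-audit-logs"))
# projects/your-gcp-project-id/topics/nanosiem-audit-logs
print(subscription_path("your-gcp-project-id", "nanosiem-audit-logs-sub"))
# projects/your-gcp-project-id/subscriptions/nanosiem-audit-logs-sub
```

The full paths are what you will see in API error messages and the Cloud Console; nano's Subscription Name field takes only the last segment.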
Naming Convention
Use descriptive names that match your nano log source names:
| Log Type | Topic Name | Subscription Name |
|---|---|---|
| Cloud Audit Logs | nanosiem-audit-logs | nanosiem-audit-logs-sub |
| VPC Flow Logs | nanosiem-vpc-flowlogs | nanosiem-vpc-flowlogs-sub |
| SCC Findings | nanosiem-scc-findings | nanosiem-scc-findings-sub |
| GKE Logs | nanosiem-gke-logs | nanosiem-gke-logs-sub |
Step 2: Create a Log Router Sink
A Log Router sink exports matching log entries from Cloud Logging to your Pub/Sub topic. The sink filter determines which logs are exported.
Cloud Audit Logs (Admin Activity + Data Access)
gcloud logging sinks create nanosiem-audit-sink \
pubsub.googleapis.com/projects/$PROJECT_ID/topics/nanosiem-audit-logs \
--log-filter='logName:"cloudaudit.googleapis.com"' \
--description="Export audit logs to nano"

VPC Flow Logs
gcloud logging sinks create nanosiem-vpcflow-sink \
pubsub.googleapis.com/projects/$PROJECT_ID/topics/nanosiem-vpc-flowlogs \
--log-filter='resource.type="gce_subnetwork" AND logName:"compute.googleapis.com%2Fvpc_flows"' \
--description="Export VPC Flow Logs to nano"

Security Command Center Findings
gcloud logging sinks create nanosiem-scc-sink \
pubsub.googleapis.com/projects/$PROJECT_ID/topics/nanosiem-scc-findings \
--log-filter='resource.type="threat_detector" OR resource.type="security_command_center"' \
--description="Export SCC findings to nano"

GKE Audit Logs
gcloud logging sinks create nanosiem-gke-sink \
pubsub.googleapis.com/projects/$PROJECT_ID/topics/nanosiem-gke-logs \
--log-filter='resource.type="k8s_cluster" AND logName:"cloudaudit.googleapis.com"' \
--description="Export GKE audit logs to nano"

Custom Filter (All Security-Relevant Logs)
To export multiple log types to a single topic:
gcloud logging sinks create nanosiem-all-security \
pubsub.googleapis.com/projects/$PROJECT_ID/topics/nanosiem-security-logs \
--log-filter='
logName:"cloudaudit.googleapis.com"
OR (resource.type="gce_subnetwork" AND logName:"compute.googleapis.com%2Fvpc_flows")
OR resource.type="threat_detector"
' \
--description="Export all security-relevant logs to nano"

Log volume: Be thoughtful with your filter. Data Access audit logs and VPC Flow Logs can be extremely high volume. Start with Admin Activity audit logs and expand from there. You can check current log volume in Cloud Logging → Logs Explorer → "Log volume" chart.
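The filter language above is evaluated by Cloud Logging's Log Router, not by nano. As a rough Python stand-in (an illustration of which entries the combined filter selects, not the real filter engine):

```python
# Simplified stand-in for the combined sink filter above.
# Illustration only -- Cloud Logging's real filter language is richer than this.

def matches_security_filter(entry: dict) -> bool:
    log_name = entry.get("logName", "")
    res_type = entry.get("resource", {}).get("type", "")
    return (
        "cloudaudit.googleapis.com" in log_name                    # audit logs
        or (res_type == "gce_subnetwork"
            and "compute.googleapis.com%2Fvpc_flows" in log_name)  # VPC flows
        or res_type == "threat_detector"                           # SCC threats
    )

audit = {"logName": "projects/p/logs/cloudaudit.googleapis.com%2Factivity",
         "resource": {"type": "service_account"}}
flow = {"logName": "projects/p/logs/compute.googleapis.com%2Fvpc_flows",
        "resource": {"type": "gce_subnetwork"}}
app_log = {"logName": "projects/p/logs/stdout",
           "resource": {"type": "k8s_container"}}

print([matches_security_filter(e) for e in (audit, flow, app_log)])
# [True, True, False]
```

Note that an ordinary application log (like the k8s_container entry) falls through all three clauses, which is exactly why a scoped filter keeps volume down.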
Grant the Sink's Service Account Publish Access
When you create a log sink, GCP creates a dedicated writer service account. This account needs permission to publish to your Pub/Sub topic.
# Get the sink's writer identity
SINK_SA=$(gcloud logging sinks describe nanosiem-audit-sink \
--format='value(writerIdentity)')
echo "Sink service account: $SINK_SA"
# Grant it the Pub/Sub Publisher role on the topic
gcloud pubsub topics add-iam-policy-binding nanosiem-audit-logs \
--member="$SINK_SA" \
--role="roles/pubsub.publisher"

Do this for every sink/topic pair. Each sink has its own writer identity, and each topic needs an explicit IAM binding. Without this, the sink will show errors in Cloud Logging and no messages will arrive in Pub/Sub.
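To confirm a binding took effect, you can fetch the topic policy with gcloud pubsub topics get-iam-policy nanosiem-audit-logs --format=json and look for the writer identity. A sketch of that check, run against a hypothetical policy document (the service account address below is made up):

```python
import json

def has_publisher(policy: dict, member: str) -> bool:
    """True if `member` holds roles/pubsub.publisher in this IAM policy."""
    return any(
        b.get("role") == "roles/pubsub.publisher" and member in b.get("members", [])
        for b in policy.get("bindings", [])
    )

# Hypothetical policy shaped like `gcloud ... get-iam-policy --format=json` output.
policy = json.loads("""{
  "bindings": [
    {"role": "roles/pubsub.publisher",
     "members": ["serviceAccount:p123456-789@gcp-sa-logging.iam.gserviceaccount.com"]}
  ],
  "etag": "BwXYZ",
  "version": 1
}""")

print(has_publisher(policy, "serviceAccount:p123456-789@gcp-sa-logging.iam.gserviceaccount.com"))
# True
```

If this returns False for your sink's writer identity, re-run the add-iam-policy-binding command from above.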
Verify the Sink Is Working
# Check sink status (should show no errors)
gcloud logging sinks describe nanosiem-audit-sink
# Wait a minute, then check if messages are arriving
gcloud pubsub subscriptions pull nanosiem-audit-logs-sub \
--limit=1 \
--auto-ack=false

If no messages appear after a few minutes, generate some activity (e.g., list IAM policies: gcloud iam roles list --limit=1) and check again. Admin Activity audit logs are generated immediately; Data Access logs may have a slight delay.
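When you pull with --format=json, each message's data field is base64-encoded. A sketch of decoding it back into a log entry, using a hypothetical captured output (the ackId and messageId are made up):

```python
import base64
import json

# Hypothetical output of:
#   gcloud pubsub subscriptions pull nanosiem-audit-logs-sub --limit=1 --format=json
pull_output = json.dumps([{
    "ackId": "abc123",
    "message": {
        "data": base64.b64encode(json.dumps({
            "logName": "projects/my-project/logs/cloudaudit.googleapis.com%2Factivity",
            "severity": "NOTICE",
        }).encode()).decode(),
        "messageId": "1234567890",
    },
}])

# Decode each message's data field back into the original LogEntry JSON.
for received in json.loads(pull_output):
    entry = json.loads(base64.b64decode(received["message"]["data"]))
    print(entry["logName"], entry["severity"])
```

Seeing a well-formed LogEntry here confirms the whole export path (sink → topic → subscription) is working before you involve nano at all.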
Step 3: Create a Service Account for nano
Create a dedicated service account with minimal permissions for nano to pull from the subscription.
# Create the service account
gcloud iam service-accounts create nanosiem-reader \
--display-name="nano Log Reader" \
--description="Service account for nano to pull logs from Pub/Sub"
# Get the full service account email
SA_EMAIL="nanosiem-reader@${PROJECT_ID}.iam.gserviceaccount.com"
echo "Service account: $SA_EMAIL"

Grant Permissions
nano needs two permissions on the subscription: consume messages and acknowledge them.
# Grant Pub/Sub Subscriber role on the subscription
gcloud pubsub subscriptions add-iam-policy-binding nanosiem-audit-logs-sub \
--member="serviceAccount:${SA_EMAIL}" \
--role="roles/pubsub.subscriber"

The roles/pubsub.subscriber role includes:
- pubsub.subscriptions.consume — pull messages
- pubsub.subscriptions.get — check subscription status
Least privilege: Only grant the Subscriber role on the specific subscription(s) nano needs to read. Do not grant project-wide Pub/Sub access. If you have multiple subscriptions, repeat the add-iam-policy-binding command for each one.
Create and Download the Key
# Create a JSON key file
gcloud iam service-accounts keys create /tmp/nanosiem-reader-key.json \
--iam-account="$SA_EMAIL"
echo "Key saved to /tmp/nanosiem-reader-key.json"

The key file will look like this (you'll paste the entire contents into nano):
{
"type": "service_account",
"project_id": "your-project-id",
"private_key_id": "key-id",
"private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
"client_email": "nanosiem-reader@your-project.iam.gserviceaccount.com",
"client_id": "123456789",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/..."
}

Treat this key like a password. Delete the local copy after uploading to nano. Consider rotating keys periodically. If nano runs on GKE in the same project, you can skip the key file and use Workload Identity instead — nano will use Application Default Credentials automatically.
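Before pasting the key into nano, a quick local sanity check can catch a truncated copy-paste. A sketch, assuming the handful of fields checked here are the ones a pull client needs (the field list is an assumption, not nano's documented requirement):

```python
import json

# Fields a service account JSON key is assumed to need for Pub/Sub auth.
REQUIRED = {"type", "project_id", "private_key", "client_email", "token_uri"}

def check_key_file(raw: str) -> list[str]:
    """Return a list of problems with a key file's contents; empty list = looks OK."""
    try:
        key = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    problems = [f"missing field: {f}" for f in sorted(REQUIRED - key.keys())]
    if key.get("type") != "service_account":
        problems.append(f'type is {key.get("type")!r}, expected "service_account"')
    return problems

raw = ('{"type": "service_account", "project_id": "my-project", '
       '"private_key": "-----BEGIN PRIVATE KEY-----\\n", '
       '"client_email": "nanosiem-reader@my-project.iam.gserviceaccount.com", '
       '"token_uri": "https://oauth2.googleapis.com/token"}')
print(check_key_file(raw))   # []
```

An empty result only means the file is structurally plausible; a revoked or wrong-project key will still fail at pull time.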
Step 4: Store Credentials in nano
- Navigate to Settings → Cloud Credentials
- Click Add Credential
- Fill in the form:
| Field | Value |
|---|---|
| Provider | GCP Pub/Sub |
| Name | A descriptive name, e.g. GCP Production - Audit Logs |
| Service Account JSON | Paste the entire contents of the JSON key file |
- Click Save
The credential is encrypted at rest. The private key will never be displayed again.
# After uploading, delete the local key file
rm /tmp/nanosiem-reader-key.json

Workload Identity / ADC fallback: If nano runs on GCP infrastructure (GKE, GCE, Cloud Run) with Workload Identity configured, you can select "None (use environment/IAM)" instead of uploading a key. Vector will use Application Default Credentials to authenticate.
Step 5: Create a Log Source
- Navigate to Feeds → New Feed (or use the Log Source Wizard)
- Select "I have sample logs" and paste a representative log entry (see examples below)
- The AI will detect the format and generate a VRL parser
- Configure the source connection:
| Field | Value |
|---|---|
| Source Type | GCP Pub/Sub |
| GCP Project ID | Your project ID, e.g. my-gcp-project-123 |
| Subscription Name | The subscription name (not the full path), e.g. nanosiem-audit-logs-sub |
| ACK Deadline (seconds) | 600 (default, matches the subscription setting) |
| Credential | Select the credential you created in Step 4 |
- Set the feed metadata (name, category, vendor, product)
- Publish the parser to create a version and deploy to Vector
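The VRL parser generated in step 3 does the actual field mapping inside Vector. As a Python illustration of the kind of flattening involved for a Cloud Audit Log entry (the output field names are examples, not nano's fixed schema):

```python
def normalize_audit_entry(entry: dict) -> dict:
    """Illustrative mapping from a Cloud Audit LogEntry to flat event fields.
    Output field names are examples, not nano's fixed schema."""
    payload = entry.get("protoPayload", {})
    return {
        "timestamp": entry.get("timestamp"),
        "user": payload.get("authenticationInfo", {}).get("principalEmail"),
        "src_ip": payload.get("requestMetadata", {}).get("callerIp"),
        "action": payload.get("methodName"),
        "service": payload.get("serviceName"),
        "resource": payload.get("resourceName"),
        "severity": entry.get("severity"),
    }

# Trimmed version of the Admin Activity sample below.
sample = {
    "timestamp": "2025-01-15T14:23:45.678Z",
    "severity": "NOTICE",
    "protoPayload": {
        "serviceName": "iam.googleapis.com",
        "methodName": "google.iam.admin.v1.CreateServiceAccount",
        "authenticationInfo": {"principalEmail": "alice@example.com"},
        "requestMetadata": {"callerIp": "203.0.113.50"},
        "resourceName": "projects/my-project/serviceAccounts/new-sa@my-project.iam.gserviceaccount.com",
    },
}
print(normalize_audit_entry(sample)["user"])   # alice@example.com
```

Whatever schema you settle on, keeping user, source IP, and action as top-level fields is what makes the search queries in Step 6 possible.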
Sample Log Events by GCP Service
Cloud Audit Log (Admin Activity)
{
"protoPayload": {
"@type": "type.googleapis.com/google.cloud.audit.AuditLog",
"serviceName": "iam.googleapis.com",
"methodName": "google.iam.admin.v1.CreateServiceAccount",
"authenticationInfo": {
"principalEmail": "alice@example.com"
},
"requestMetadata": {
"callerIp": "203.0.113.50",
"callerSuppliedUserAgent": "google-cloud-sdk gcloud/456.0.0"
},
"resourceName": "projects/my-project/serviceAccounts/new-sa@my-project.iam.gserviceaccount.com",
"request": {
"account_id": "new-sa",
"service_account": {
"display_name": "New Service Account"
}
}
},
"insertId": "abc123def456",
"resource": {
"type": "service_account",
"labels": {
"project_id": "my-project",
"email_id": "new-sa@my-project.iam.gserviceaccount.com"
}
},
"timestamp": "2025-01-15T14:23:45.678Z",
"severity": "NOTICE",
"logName": "projects/my-project/logs/cloudaudit.googleapis.com%2Factivity"
}

VPC Flow Log
{
"insertId": "1a2b3c4d",
"jsonPayload": {
"connection": {
"src_ip": "10.128.0.5",
"src_port": 49152,
"dest_ip": "203.0.113.50",
"dest_port": 443,
"protocol": 6
},
"src_instance": {
"project_id": "my-project",
"zone": "us-central1-a",
"vm_name": "web-server-1"
},
"bytes_sent": "5040",
"packets_sent": "25",
"start_time": "2025-01-15T14:23:00Z",
"end_time": "2025-01-15T14:24:00Z",
"reporter": "SRC"
},
"resource": {
"type": "gce_subnetwork",
"labels": {
"project_id": "my-project",
"subnetwork_name": "default",
"location": "us-central1-a"
}
},
"timestamp": "2025-01-15T14:24:05.123Z",
"logName": "projects/my-project/logs/compute.googleapis.com%2Fvpc_flows"
}

Security Command Center Finding
{
"finding": {
"name": "organizations/123/sources/456/findings/abc123",
"parent": "organizations/123/sources/456",
"state": "ACTIVE",
"category": "PUBLIC_BUCKET_ACL",
"resourceName": "//storage.googleapis.com/my-public-bucket",
"severity": "HIGH",
"eventTime": "2025-01-15T14:23:45Z",
"createTime": "2025-01-15T14:23:45Z",
"sourceProperties": {
"ResourcePath": "my-public-bucket",
"AllowedPolicy": "allUsers"
}
},
"resource": {
"type": "google.cloud.storage.Bucket",
"name": "//storage.googleapis.com/my-public-bucket",
"project": "my-project"
}
}

Step 6: Verify Ingestion
After publishing, allow a few minutes for Vector to connect to the subscription and start pulling messages.
Check Feed Health
- Go to Feeds → select your new log source
- On the Overview tab, check:
- Status: Should show "Healthy"
- Event Volume chart: Should show events arriving
- Last Event: Should show a recent timestamp
Search Your Data
Navigate to Search and run a query for your source type:
source_type="gcp_audit"

Or search for specific activity:
source_type="gcp_audit" methodName="CreateServiceAccount"
| table timestamp, user, src_ip, methodName, resourceName

Check for Errors
If no data appears:
- Check the subscription backlog — are messages accumulating?

  gcloud pubsub subscriptions describe nanosiem-audit-logs-sub \
    --format='value(numUndeliveredMessages)'

  - Messages accumulating: nano isn't pulling. Check credentials and the log source deployment status.
  - Zero messages: The Log Router sink isn't exporting, or no matching logs are being generated. Check the sink configuration and generate some test activity.
- Verify the sink is delivering:

  # Check for sink errors
  gcloud logging sinks describe nanosiem-audit-sink \
    --format='yaml(writerIdentity,destination,filter,outputVersionFormat)'

- Check Vector logs for connection errors:

  docker logs nanosiem-vector 2>&1 | grep -i "gcp\|pubsub\|error"

- Check ingestion errors in nano at System → Ingestion Errors
Multi-Project Setup
For organizations with multiple GCP projects, you have two approaches:
Option A: Organization-Level Sink (Recommended)
Create a single sink at the organization or folder level that captures logs from all projects:
# Organization-level sink (requires Org Admin)
gcloud logging sinks create nanosiem-org-audit-sink \
pubsub.googleapis.com/projects/$PROJECT_ID/topics/nanosiem-audit-logs \
--organization=ORG_ID \
--log-filter='logName:"cloudaudit.googleapis.com"' \
--include-children \
--description="Export all org audit logs to nano"
# Grant the sink's writer identity publish access
SINK_SA=$(gcloud logging sinks describe nanosiem-org-audit-sink \
--organization=ORG_ID \
--format='value(writerIdentity)')
gcloud pubsub topics add-iam-policy-binding nanosiem-audit-logs \
--member="$SINK_SA" \
--role="roles/pubsub.publisher"

Option B: Per-Project Sinks
Create individual sinks in each project, all publishing to the same topic. More granular control but more to manage.
Troubleshooting
"Permission denied" in Vector logs
- Verify the service account has roles/pubsub.subscriber on the subscription (not the topic)
- Check the service account JSON was pasted correctly (it should start with { and end with })
- If using Workload Identity, verify the Kubernetes service account is annotated correctly
Messages in subscription but no events in nano
- Check the log source is deployed (published) — look for "Unpublished changes" banner
- Verify the Subscription Name in the log source config is just the name (e.g., nanosiem-audit-logs-sub), not the full resource path
- Verify the Project ID matches the project where the subscription lives
- Look for parse errors in System → Ingestion Errors
Sink shows errors in Cloud Logging
- Check the writer identity has roles/pubsub.publisher on the topic
- Verify the topic exists and hasn't been deleted
- Check the Pub/Sub API is enabled:
gcloud services enable pubsub.googleapis.com
High latency or message redelivery
- Increase the ACK deadline on both the subscription and in nano's log source config (both should match)
- Check Vector resource allocation — Pub/Sub processing may need more CPU/memory for high-volume subscriptions
- Consider multiple subscriptions to the same topic for parallel consumption
Subscription expiration
By default, Pub/Sub subscriptions expire after 31 days of inactivity. Use --expiration-period=never when creating subscriptions (as shown in Step 1) to prevent this. For existing subscriptions:
gcloud pubsub subscriptions update nanosiem-audit-logs-sub \
--expiration-period=never

Recommended Log Sources for Security
If you're starting from scratch, prioritize these GCP log sources:
| Priority | Source | Log Filter | Why |
|---|---|---|---|
| 1 | Admin Activity Audit Logs | logName:"activity" | All administrative actions — IAM changes, resource creation, config modifications |
| 2 | Security Command Center | resource.type="threat_detector" | GCP-native threat detection and misconfiguration findings |
| 3 | Data Access Audit Logs | logName:"data_access" | Who read/listed/exported what data (high volume — filter carefully) |
| 4 | VPC Flow Logs | resource.type="gce_subnetwork" | Network traffic for lateral movement and exfiltration detection |
| 5 | GKE Audit Logs | resource.type="k8s_cluster" | Kubernetes API activity for container security monitoring |
| 6 | Cloud DNS Logs | resource.type="dns_query" | DNS queries for C2 and exfiltration detection |
Data Access Audit Logs are disabled by default for most services and can be extremely high volume when enabled. Start with Admin Activity logs and enable Data Access selectively for sensitive services (BigQuery, Cloud Storage, IAM). See GCP documentation on configuring data access audit logs.
Next Steps
- Create detection rules for your GCP logs
- Configure enrichment to add GeoIP and threat intel to GCP source IPs
- Set up the AWS integration if you also use AWS