Google Cloud Platform (GCP) Audit Log

Reading time: 15 min

What is a Google Cloud Platform (GCP) audit log?

Google Cloud Platform audit logs are essentially a continuous record of activity within your cloud environment. They track "who did what, where, and when" across various Google Cloud services, providing invaluable insights into user actions, system events, and resource changes. These logs answer crucial questions like:

  • Who created a new Cloud Storage bucket?
  • Which user deleted a virtual machine instance?
  • Was there any attempt to modify IAM permissions?
  • Did sensitive data access occur from unauthorized locations?

GCP audit logs encompass four main categories:

  • Admin Activity: Tracks administrative actions like modifying IAM roles, creating service accounts, or managing billing settings.
  • Data Access: Captures activities involving read/write operations on sensitive data across various GCP services like Cloud Storage, BigQuery, and Cloud SQL.
  • Policy Denied: Records instances where access to resources was denied due to insufficient permissions or security policies.
  • System Event: Logs internal system events like VM boot/shutdown, project creation/deletion, or service maintenance activities.

Why are Google Cloud Platform audit logs important?

GCP audit logs are vital for several reasons:

  • Security and Compliance: They provide audit trails for regulatory compliance, incident investigation, and threat detection. By analyzing logs, you can identify suspicious activity, track user actions, and reconstruct events for forensic analysis.
  • Troubleshooting and Debugging: Audit logs are crucial for troubleshooting operational issues. For example, if a database update fails, you can pinpoint the user and action through the logs to diagnose the cause.
  • Cost Optimization: Logs can reveal underutilized resources or unauthorized activities leading to unnecessary costs. Monitoring access patterns and resource usage can help optimize your cloud expenses.
  • Resource Management: Logs track changes to resource configurations, helping you understand who made alterations and why. This transparency and accountability support effective resource management and prevent unauthorized modifications.

The Structure of GCP Audit Logs: A Detailed Breakdown

1. Log Entry Structure:

All GCP audit logs share a common base structure, represented by the LogEntry object. This object contains several key fields:

  • logName: The unique identifier of the log stream (e.g., projects//logs/cloudaudit.googleapis.com%2Factivity)
  • resource: The monitored resource affected by the event (e.g., a gcs_bucket with its identifying labels)
  • timestamp: The time of the event (UTC)
  • severity: The severity level of the event (e.g., INFO, WARNING, ERROR)
  • insertId: A unique identifier for the log entry within the stream
  • protoPayload: The service-specific payload containing detailed event information
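
For illustration, here is a minimal Python sketch using the google-cloud-logging client library. It assumes Application Default Credentials and a placeholder project ID, and it simply lists a few Admin Activity audit entries and prints the base LogEntry fields described above:

from google.cloud import logging

# Assumes Application Default Credentials; "my-project" is a placeholder project ID.
client = logging.Client(project="my-project")

# Admin Activity audit entries are written to the cloudaudit.googleapis.com%2Factivity log.
log_filter = 'logName="projects/my-project/logs/cloudaudit.googleapis.com%2Factivity"'

for entry in client.list_entries(filter_=log_filter, max_results=10):
    # Base LogEntry fields shared by every audit log entry.
    print(entry.timestamp, entry.severity, entry.log_name, entry.insert_id)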

2. Log Types:

GCP audit logs fall into four main categories:

  • Admin Activity: Logs related to administrative actions like creating users, modifying roles, or updating resource configurations.
  • Data Access: Logs related to accessing data within your GCP resources, categorized as ADMIN_READ, DATA_READ, and DATA_WRITE.
  • System Event: Logs related to system events like resource creation, deletion, or state changes.
  • Policy Denied: Logs recorded when a request is denied by a security policy or insufficient permissions.
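
As a reference, the sketch below (plain Python, purely illustrative, with a placeholder project ID) shows the standard log IDs these categories are written to, which you can plug into Logging filters:

# Standard Cloud Audit Logs log names per category; "my-project" is a placeholder.
PROJECT = "my-project"
AUDIT_LOG_NAMES = {
    "Admin Activity": f"projects/{PROJECT}/logs/cloudaudit.googleapis.com%2Factivity",
    "Data Access": f"projects/{PROJECT}/logs/cloudaudit.googleapis.com%2Fdata_access",
    "System Event": f"projects/{PROJECT}/logs/cloudaudit.googleapis.com%2Fsystem_event",
    "Policy Denied": f"projects/{PROJECT}/logs/cloudaudit.googleapis.com%2Fpolicy",
}

# Example: a filter that selects only Data Access audit entries.
data_access_filter = f'logName="{AUDIT_LOG_NAMES["Data Access"]}"'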

3. Payload Structure:

The protoPayload field within the LogEntry object contains service-specific details about the event; for audit logs it holds an AuditLog object. Its exact contents vary by service and event type, but it typically includes:

  • serviceName: The Google Cloud service that generated the log (e.g., storage.googleapis.com)
  • methodName: The API call made (e.g., storage.objects.create)
  • resourceName: The full name of the affected resource
  • authenticationInfo: The identity (principalEmail) of the user or service account that initiated the event
  • request: Details about the request parameters
  • response: Details about the response from the service
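
Continuing the sketch above, the fragment below pulls these AuditLog fields out of an entry's payload. It again uses the google-cloud-logging client with a placeholder project ID, and it assumes the library returns the audit payload as a JSON-style dictionary:

from google.cloud import logging

client = logging.Client(project="my-project")  # placeholder project ID
audit_filter = 'logName:"cloudaudit.googleapis.com"'  # any audit log in the project

for entry in client.list_entries(filter_=audit_filter, max_results=5):
    payload = entry.payload or {}  # AuditLog details, assumed to be dict-like here
    print("service:  ", payload.get("serviceName"))
    print("method:   ", payload.get("methodName"))
    print("resource: ", payload.get("resourceName"))
    print("principal:", payload.get("authenticationInfo", {}).get("principalEmail"))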

4. Buckets:

 

A GCP Audit Logs bucket serves two primary functions:

  1. Storing GCP Audit Logs: These logs hold a chronological record of events related to access, modifications, and other actions performed on various Google Cloud resources. This includes activities like creating Cloud Storage buckets, modifying IAM roles, or deleting Compute Engine instances. Enabling audit logging for specific resources ensures a detailed trail of who did what, when, and how.
  2. Routing and Managing Audit Logs: You can configure GCP Audit Logs to send entries to a bucket you create specifically for this purpose. This allows you to:
      • Centralize logs: All audit logs for your project or organization can be stored in a single bucket, simplifying analysis and management.
      • Integrate with other tools: You can export logs from the bucket to BigQuery for deeper analysis using SQL queries and visualizations, or feed them into security and compliance tools.
      • Control access and retention: You can define access controls for the bucket, ensuring only authorized individuals can access the audit logs, and set retention policies specifying how long logs are stored before being automatically deleted.

Examples:

Object Create/Delete: Logs the creation and deletion of objects within any Cloud Storage bucket (Data Access audit logs). Example filter: resource.type="gcs_bucket" AND (protoPayload.methodName="storage.objects.create" OR protoPayload.methodName="storage.objects.delete").

Bucket Create/Delete: Tracks the creation and deletion of Cloud Storage buckets themselves (Admin Activity audit logs). Example filter: resource.type="gcs_bucket" AND (protoPayload.methodName="storage.buckets.create" OR protoPayload.methodName="storage.buckets.delete").

Object Access: Captures read and write attempts on Cloud Storage objects, revealing potential unauthorized access (Data Access audit logs, which must be enabled for Cloud Storage). Example filter for reads: resource.type="gcs_bucket" AND protoPayload.methodName="storage.objects.get".
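
Assuming Data Access audit logs are enabled for Cloud Storage, a query such as the following sketch (google-cloud-logging client; project and bucket names are placeholders) would surface recent object reads in a specific bucket:

from google.cloud import logging

client = logging.Client(project="my-project")  # placeholder project ID

# Object reads on one bucket; requires Data Access audit logs for Cloud Storage.
object_read_filter = (
    'resource.type="gcs_bucket" '
    'AND resource.labels.bucket_name="my-bucket" '
    'AND protoPayload.methodName="storage.objects.get"'
)

for entry in client.list_entries(filter_=object_read_filter, max_results=20):
    print(entry.timestamp, entry.log_name)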

5. Logging Sinks and Pub/Subs:

Logging Sinks: Sinks act as gatekeepers for your log streams. Each sink has a filter based on criteria you set (log types, severity levels, resource names, and so on); every log entry that matches is routed to the sink's destination. Destinations include Cloud Storage buckets for archiving, BigQuery datasets for analysis, Pub/Sub topics for streaming to external services such as alerting or SIEM tools, and other Cloud Logging buckets. Logging sinks essentially filter and distribute your logs, keeping them organized and accessible.

Pub/Sub: Pub/Sub is a high-throughput message bus. When a log entry is published to a Pub/Sub topic, it is delivered in near real time to every subscriber of that topic. Subscribers can be Cloud Functions for real-time processing, Dataflow pipelines for streaming analytics, or third-party systems such as SIEM platforms. Pub/Sub acts as a real-time communication hub, ensuring your logs reach their consumers quickly and reliably.

Examples:

Sink Creation/Deletion: Monitors the creation and deletion of logging sinks that route logs to external destinations (Admin Activity audit logs). Example filter: resource.type="logging_sink" AND logName:"cloudaudit.googleapis.com%2Factivity".

Pub/Sub Topic Creation/Deletion: Tracks the creation and deletion of Pub/Sub topics used for real-time message delivery (Admin Activity audit logs). Example filter: resource.type="pubsub_topic" AND logName:"cloudaudit.googleapis.com%2Factivity".

Pub/Sub Publish/Subscribe: Records message publish and subscribe operations on Pub/Sub topics, which can reveal unexpected data flows (Data Access audit logs, which must be enabled for Pub/Sub). Example filter: resource.type="pubsub_topic" AND logName:"cloudaudit.googleapis.com%2Fdata_access".
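
To make the sink-to-Pub/Sub relationship concrete, here is a minimal sketch using the google-cloud-logging client. The project, sink, and topic names are placeholders; the topic must already exist, and its IAM policy must allow the sink's writer identity to publish:

from google.cloud import logging

client = logging.Client(project="my-project")  # placeholder project ID

# Route every audit log entry in the project to an existing Pub/Sub topic.
sink = client.sink(
    "audit-logs-to-pubsub",  # placeholder sink name
    filter_='logName:"cloudaudit.googleapis.com"',
    destination="pubsub.googleapis.com/projects/my-project/topics/audit-logs",
)
sink.create()
print("Created sink:", sink.name)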

6. Additional Information:

GCP audit logs may also include additional fields depending on the service and event type, such as:

  • ip: The IP address of the caller (recorded as requestMetadata.callerIp inside the protoPayload)
  • userAgent: The user agent string of the client making the request (requestMetadata.callerSuppliedUserAgent)
  • status: The status of the operation returned by the service
  • labels: User-defined labels added to the log entry

Google Cloud Platform Log Types

Admin Activity Logs:

These logs record actions taken by users or service accounts that modify administrative settings, like creating IAM roles or updating billing configurations.

Key elements:

activity: Field containing the specific action performed (e.g., "role.create", "project.update").

user: Identity of the user or service account who initiated the action.

previousState: Previous state of the resource before the action.

newState: New state of the resource after the action.


Example (note: the examples in this section are simplified for readability; in an actual audit log entry these details appear inside the protoPayload field):

{
  "timestamp": "2023-12-19T00:00:00Z",
  "severity": "INFO",
  "resource": "projects/my-project",
  "resource_identifier": "my-project",
  "project": "my-project",
  "service": "iam",
  "activity": "role.create",
  "user": "admin@example.com",
  "previousState": { },
  "newState": {
    "name": "roles/storage.admin"
  }
}

 

System Event Logs:

These logs record actions taken by the Google Cloud system, such as resource creation or deletion.

Key elements:

systemEventType: Specific event type triggered by the system (e.g., "resource.create", "resource.delete").

eventDescription: Detailed description of the event.

Example:

{
  "timestamp": "2023-12-19T00:01:00Z",
  "severity": "INFO",
  "resource": "projects/my-project/compute/instances/my-instance",
  "resource_identifier": "projects/my-project/zones/us-central1-a/instances/my-instance",
  "project": "my-project",
  "service": "compute.googleapis.com",
  "systemEventType": "resource.create",
  "eventDescription": "Instance my-instance created in zone us-central1-a."
}

Data Access Logs:

These logs record user and application access to data stored in Google Cloud services like Cloud Storage and BigQuery.

Key elements:

data_access: Field indicating the type of data access (e.g., "read", "write", "delete").

dataset: Dataset or table accessed within the service (e.g., "my-bucket", "my-dataset").

ip: IP address of the source accessing the data.

Example:

{
  "timestamp": "2023-12-19T00:02:00Z",
  "severity": "INFO",
  "resource": "projects/my-project/buckets/my-bucket",
  "resource_identifier": "projects/my-project/buckets/my-bucket",
  "project": "my-project",
  "service": "storage.googleapis.com",
  "data_access": "read",
  "dataset": "my-object.txt",
  "ip": "192.168.1.1"
}


How to Export Audit Logs from Google Cloud Platform

There are three primary methods for exporting audit logs from GCP:

 

1. Cloud Storage:

  • Simple and cost-effective: This is the easiest way to export logs, with minimal configuration required.
  • Durable archiving: Exported entries are written to the bucket as JSON files in hourly batches, which suits long-term, low-cost retention.
  • Scalability: Cloud Storage can handle large volumes of logs efficiently.

Steps:

  • Create a Cloud Storage bucket for storing exported logs.
  • In the Cloud Logging console, open the Log Router page.
  • Click "Create sink" and select "Cloud Storage bucket" as the destination.
  • Give the sink a name, choose the destination bucket, and set an inclusion filter for the audit logs you want to export.
  • Click "Create sink" to start exporting logs.
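
The same export can be scripted. The sketch below uses the google-cloud-logging client with placeholder project, sink, and bucket names, and assumes the bucket already exists and grants the sink's writer identity permission to write objects:

from google.cloud import logging

client = logging.Client(project="my-project")  # placeholder project ID

# Export Admin Activity audit logs to an existing Cloud Storage bucket.
sink = client.sink(
    "audit-logs-to-gcs",  # placeholder sink name
    filter_='logName:"cloudaudit.googleapis.com%2Factivity"',
    destination="storage.googleapis.com/my-audit-log-bucket",  # placeholder bucket
)
sink.create()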

2. BigQuery:

  • Advanced analysis: BigQuery enables powerful querying and analysis of exported logs.
  • Integration with other datasets: You can combine audit logs with other data sources stored in BigQuery for comprehensive insights.
  • Cost considerations: BigQuery charges for data storage and queries, making it potentially more expensive than Cloud Storage for long-term log storage.

Steps:

  • Create a BigQuery dataset for storing exported logs.
  • In the Cloud Logging console, open the Log Router page.
  • Click "Create sink" and select "BigQuery dataset" as the destination.
  • Give the sink a name, choose the destination dataset, and set an inclusion filter for the audit logs you want to export.
  • Click "Create sink" to start exporting logs.
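
A programmatic equivalent, again as a hedged sketch with placeholder project, sink, and dataset names (the dataset must already exist):

from google.cloud import logging

client = logging.Client(project="my-project")  # placeholder project ID

# Export Data Access audit logs to an existing BigQuery dataset for SQL analysis.
sink = client.sink(
    "audit-logs-to-bq",  # placeholder sink name
    filter_='logName:"cloudaudit.googleapis.com%2Fdata_access"',
    destination="bigquery.googleapis.com/projects/my-project/datasets/audit_logs",
)
sink.create()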

3. Cloud Pub/Sub:

  • Real-time streaming: Ideal for scenarios where immediate processing of audit logs is required.
  • Integration with other applications: You can use Cloud Pub/Sub to send logs to external applications for real-time analysis or alerting.
  • Complexity: Setting up and managing Cloud Pub/Sub integrations can be more complex than the other methods.
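
On the consumer side, a real-time processor might look like the sketch below, using the google-cloud-pubsub client. The project and subscription names are placeholders, and the subscription is assumed to be attached to the topic that the logging sink publishes to:

import json

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
# Placeholder project and subscription names.
subscription_path = subscriber.subscription_path("my-project", "audit-logs-sub")

def callback(message):
    # Each Pub/Sub message delivered by a logging sink carries one LogEntry as JSON.
    entry = json.loads(message.data.decode("utf-8"))
    print(entry.get("logName"), entry.get("protoPayload", {}).get("methodName"))
    message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=30)  # listen for 30 seconds, then stop
except Exception:
    streaming_pull.cancel()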

Best practices for using Google Cloud Platform audit logs

Using GCP audit logs effectively can significantly improve your security posture and compliance. Here are some best practices to get the most out of them:

Log Selection and Configuration:

  • Focus on key logs: Not all logs are created equal. Identify the services and activities crucial to your security and compliance needs. Prioritize monitoring logs for admin activity, data access, system events, and policy violations for these areas.
  • Leverage Data Access logs thoughtfully: These logs track resource read/write activity and are not enabled by default (except for BigQuery). Enable them selectively for sensitive resources and test the configuration in a dedicated project before rolling out to production; a sketch of the relevant IAM audit configuration follows this list.
  • Define clear retention policies: Logs take up storage space and incur costs. Determine how long you need to retain logs based on compliance requirements and investigation needs. Use predefined or custom retention policies to manage log life cycles efficiently.
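
For reference, Data Access logging (mentioned above) is controlled through the project's IAM policy. Below is a hedged sketch of the relevant auditConfigs fragment expressed as a Python dictionary; the service name and log types are examples, and the fragment would be applied via the setIamPolicy API or the console's Audit Logs page:

# Fragment of an IAM policy that enables Data Access audit logs for Cloud Storage.
audit_config_fragment = {
    "auditConfigs": [
        {
            "service": "storage.googleapis.com",  # or "allServices" for every service
            "auditLogConfigs": [
                {"logType": "ADMIN_READ"},
                {"logType": "DATA_READ"},
                {"logType": "DATA_WRITE"},
            ],
        }
    ]
}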

Security and Access Control:

  • Protect your logs: Enable customer-managed encryption keys (CMEK) for the Log Router to control how logs are encrypted at rest; data in transit is already encrypted by Google by default.
  • Limit access: Implement strict IAM controls to restrict who can access and manage audit logs. Consider separate roles for viewing, analyzing, and managing logs.
  • Monitor for configuration changes: Enable audit logging for changes to audit logging configurations themselves. This allows you to track any modifications that might alter log collection or retention.

Analysis and Monitoring:

  • Utilize log analysis tools: Integrate Cloud Audit Logs with SIEM tools or Cloud Monitoring to analyze logs for anomalies, suspicious activity, and compliance violations.
  • Set up alerts: Configure alerts to notify you of critical events identified in your audit logs, such as unauthorized access attempts or policy violations.
  • Regularly review logs: Don't let valuable insights go unnoticed. Dedicate time to reviewing logs for potential security incidents, compliance issues, and operational trends.

Additional Tips:

  • Use test projects: Validate your audit log configuration in a dedicated test project before applying it to production environments.
  • Follow Google's recommendations: Google provides detailed documentation and best practices guides for Cloud Audit Logs. Refer to them regularly for the latest information and recommendations.
  • Stay informed: Keep up with updates and new features related to Cloud Audit Logs. Attend webinars, read blog posts, and explore documentation updates to stay ahead of the curve.

Additional Resources:

  • Cloud Audit Logs overview: https://cloud.google.com/logging/docs/audit
  • IAM audit logging: https://cloud.google.com/iam/docs/audit-logging

Collecting and Analyzing GCP Audit Logs with FileAuditor

FileAuditor is a powerful tool for collecting and analyzing Google Cloud Platform (GCP) audit logs. It offers a user-friendly interface and a variety of features to help you understand and secure your GCP environment. Here's a detailed guide on how to use FileAuditor for this purpose:

1. Setting Up FileAuditor:

  • Create a FileAuditor Account: Sign up for a free trial or paid plan on the FileAuditor website (https://searchinform.com/products/searchinform-fileauditor/).
  • Connect GCP Account: In FileAuditor, navigate to "Connections" and click "Add Connection." Choose "Google Cloud Platform" and follow the on-screen instructions to grant FileAuditor access to your GCP project.
  • Configure Audit Logs: Go to "Audit Logs" and select the desired logs you want to collect. You can choose from various logs like Admin Activity, Data Access, or System Events. You can also filter logs by specific projects, resources, or users.

2. Collecting Audit Logs:

  • FileAuditor Agent: FileAuditor recommends deploying its agent on a Compute Engine instance in your GCP project for efficient log collection. Follow the instructions provided in FileAuditor to install and configure the agent.
  • Direct Log Upload: Alternatively, you can upload audit logs directly from Cloud Storage buckets to FileAuditor. Ensure the bucket permissions are set correctly to allow FileAuditor access.

3. Analyzing Audit Logs:

  • Log Viewer: Once logs start flowing in, you can access them in the "Log Viewer." Here, you can filter and search logs based on various criteria like timestamps, log types, resources, or users.
  • Visualizations: FileAuditor provides various visualizations like charts and graphs to help you identify trends and patterns in your audit logs. This can be helpful for detecting suspicious activity or understanding common access patterns.
  • Alerts and Reports: You can set up custom alerts to be notified when specific events occur in your audit logs, such as unauthorized access attempts or resource deletions. FileAuditor also allows you to generate reports summarizing your audit log data.

4. Advanced Features:

  • Compliance: FileAuditor can help you meet various compliance requirements by providing reports and insights into your audit logs. It supports compliance frameworks like SOC 2, GDPR, and PCI DSS.
  • Anomaly Detection: FileAuditor uses machine learning to detect anomalies in your audit logs, such as unusual access patterns or suspicious activity. This can help you proactively identify potential security threats.
  • Integration with Other Tools: FileAuditor integrates with various SIEM and security tools to provide a comprehensive view of your security posture.

Additional Tips:

  • Start with collecting a small subset of logs and gradually expand as you get comfortable with FileAuditor.
  • Use filters and search effectively to focus on relevant audit data.
  • Set up alerts for critical events to be notified promptly.
  • Regularly review reports and visualizations to identify trends and patterns.
  • Explore FileAuditor's advanced features like anomaly detection and compliance reporting.

Secure your GCP environment with FileAuditor: Collect & Analyze Audit Logs Effortlessly!
