EDU-709 - audit logging pages for GCP #3047

Merged
merged 23 commits into from
Sep 13, 2024
Changes from 8 commits

Commits
36af67d
splitting export page
MasonEgger Aug 27, 2024
6491eed
adding gcp export
MasonEgger Aug 28, 2024
d6d1f4d
first draft of audit logging with content from engineering
MasonEgger Aug 28, 2024
e58c1fa
fixing broken code sample
MasonEgger Aug 28, 2024
4c95877
removing line about example consuming audit log as content moved
MasonEgger Sep 3, 2024
0927fff
fixing broken anchor link
MasonEgger Sep 3, 2024
8d76e91
removing export as its in another commit
MasonEgger Sep 10, 2024
dfae828
adding export back
MasonEgger Sep 10, 2024
c26aec2
Update docs/production-deployment/cloud/audit-logging-aws.mdx
MasonEgger Sep 12, 2024
6dab0a3
Update docs/production-deployment/cloud/audit-logging-aws.mdx
MasonEgger Sep 12, 2024
68fe68b
Update docs/production-deployment/cloud/audit-logging.mdx
MasonEgger Sep 12, 2024
ae022ca
Update docs/production-deployment/cloud/audit-logging-aws.mdx
MasonEgger Sep 12, 2024
e52062e
moving examples back to main page
MasonEgger Sep 12, 2024
32d421b
removing gcp pubsub
MasonEgger Sep 12, 2024
f9fa253
Update docs/production-deployment/cloud/audit-logging.mdx
MasonEgger Sep 12, 2024
00534a5
updates per Rafaels feedback
MasonEgger Sep 12, 2024
10b917a
adding things that went missing
MasonEgger Sep 12, 2024
627ede4
adding correct example of audit log
MasonEgger Sep 12, 2024
74f4c55
Merge branch 'main' into gcp-edits-cloud-audit-logging
MasonEgger Sep 12, 2024
0db9129
update from Jwahir
MasonEgger Sep 12, 2024
056a043
removing GCP reference
MasonEgger Sep 12, 2024
3025d88
Merge branch 'main' into gcp-edits-cloud-audit-logging
MasonEgger Sep 13, 2024
99e8a06
final tweaks
MasonEgger Sep 13, 2024
176 changes: 176 additions & 0 deletions docs/production-deployment/cloud/audit-logging-aws.mdx
@@ -0,0 +1,176 @@
---
id: audit-logging-aws
title: Audit Logging - AWS Kinesis
sidebar_label: Amazon Kinesis
description: Audit Logging in Temporal Cloud provides forensic information, integrating with Amazon Kinesis for secure data handling and supporting key Admin and API Key operations. This streamlines audit and compliance processes.
slug: /cloud/audit-logging-aws
toc_max_heading_level: 4
keywords:
- audit logging
- explanation
- how-to
- operations
- temporal cloud
- term
- troubleshooting
- aws
- kinesis
tags:
- audit-logging
- explanation
- how-to
- operations
- temporal-cloud
- term
- troubleshooting
- aws
- kinesis
---

## How to configure Audit Logging using AWS Kinesis {#configure-audit-logging}

To set up Audit Logging, you must have an Amazon Web Services (AWS) account and set up Kinesis Data Streams.

1. If you don't have an AWS account, follow the instructions from AWS in [Create and activate an AWS account](https://aws.amazon.com/premiumsupport/knowledge-center/create-and-activate-aws-account/).
2. To set up Kinesis Data Streams, open the [AWS Management Console](https://aws.amazon.com/console/), search for Kinesis, and start the setup process.

You can use [this AWS CloudFormation template](https://temporal-auditlogs-config.s3.us-west-2.amazonaws.com/cloudformation/iam-role-for-temporal-audit-logs.yaml) to create an IAM role with access to a Kinesis stream you have in your account.

Be aware that Kinesis has a rate limit of 1,000 messages per second and quotas for both the number of records written and the size of the records.
For more information, see [Why is my Kinesis data stream throttling?](https://aws.amazon.com/premiumsupport/knowledge-center/kinesis-data-stream-throttling/)

### Create an Audit Log sink

1. In the Temporal Cloud UI, select **Settings**.
1. On the **Settings** page, select **Integrations**.
1. In the **Audit Logging** card, select **Configure Audit Logs**.
1. On the **Audit Logging** page, choose your **Access method** (either **Auto** or **Manual**).
- **Auto:** Configure the AWS CloudFormation stack in your AWS account from the Cloud UI.
- **Manual:** Use a generated AWS CloudFormation template to set up Kinesis manually.
1. In **Kinesis ARN**, paste the Kinesis ARN from your AWS account.
1. In **Role name**, provide a name for a new IAM Role.
1. In **Select an AWS region**, select the appropriate region for your Kinesis stream.

If you chose the **Auto** access method, continue with the following steps:

1. Select **Save and launch stack**.
1. In **Stack name** in the AWS CloudFormation console, specify a name for the stack.
1. In the lower-right corner of the page, select **Create stack**.

If you chose the **Manual** access method, continue with the following steps:

1. Select **Save and download template**.
1. Open the [AWS CloudFormation console](https://console.aws.amazon.com/cloudformation/).
1. Select **Create Stack**.
1. On the **Create stack** page, select **Template is ready** and **Upload a template file**.
1. Select **Choose file** and specify the template you generated in step 1.
1. Select **Next** on this page and on the next two pages.
1. On the **Review** page, select **Create stack**.

### Audit Log format

The log sent to the Kinesis stream is JSON in the following format:

```json
{
  "emit_time": // Time the operation was recorded
  "level": // Level of the log entry, such as info, warning, or error
  "user_email": // Email address of the user who initiated the operation
  "caller_ip_address": // Customer IP address or server name
  "operation": // Operation that was performed
  "details": // Details of the operation
  "status": // Status, such as OK or ERROR
  "category": // Category of the log entry: Admin or System
  "version": // Version of the log entry
  "log_id": // Unique ID of the log entry
  "principal": // Information about who initiated the operation
  "request_id": // Optional async request ID set by the user when sending a request
}
```

## How to consume an Audit Log {#consume-an-audit-log}

After you create an Audit Log sink, wait for the logs to flow into the Kinesis stream.
You should see the first logs 2–10 minutes after you configure the sink.
Subsequent logs arrive every 2 minutes if any actions occurred during that 2-minute window.

:::note

You must configure and implement your own consumer of the Kinesis stream.
For an example, see [Example of consuming an Audit Log](#example-of-consuming-an-audit-log).

:::

### Example of an Audit Log

The following example shows the contents of an Audit Log.

```json
{"emit_time":"2023-10-24T08:19:41Z","level":"LOG_LEVEL_INFO","user_email":"[email protected]","operation":"UpdateAccount","details":{"client_ca_fingerprints":["5bb99d14fa602f7d39b7d048674a2251"],"search_attribute_update":{}},"status":"OK","category":"LOG_CATEGORY_ADMIN","log_id":"0mc69c0323b871293ce231dd1c7fb634","principal":{"id":"988cb80b-d6be-4bb5-9c87-d09f93f58ed3","type":"user","name":"[email protected]"}}
**********
{"emit_time":"2023-10-25T21:16:42Z","level":"LOG_LEVEL_INFO","user_email":"[email protected]","operation":"DeleteUser","details":{"target_users":["0b741c47-e093-47d1-9b74-f2359129f78f"],"search_attribute_update":{}},"status":"OK","category":"LOG_CATEGORY_ADMIN","log_id":"0mc69c0323b871293ce231dd1c7fb635","request_id":"445297d3-43a7-4793-8a04-1b1dd1999641","principal":{"id":"b160473e-e40d-4a81-90d1-f4218269e6e4","type":"user","name":"[email protected]"}}
**********
{"emit_time":"2023-11-03T19:31:45Z","level":"LOG_LEVEL_INFO","user_email":"[email protected]","operation":"InviteUsers","details":{"target_users":["[email protected]"],"search_attribute_update":{}},"status":"OK","category":"LOG_CATEGORY_ADMIN","log_id":"0mc69c0323b871293ce231dd1c7fb636","principal":{"id":"35fdc757-9637-446b-b386-12ed475511ad","type":"user","name":"[email protected]"}}
**********
{"emit_time":"2023-11-08T08:06:40Z","level":"LOG_LEVEL_INFO","user_email":"[email protected]","operation":"UpdateUser","details":{"target_users":["[email protected]"],"search_attribute_update":{}},"status":"OK","category":"LOG_CATEGORY_ADMIN","log_id":"0mc69c0323b871293ce231dd1c7fb637","request_id":"445297d3-43a7-4793-8a04-1b1dd1999640","principal":{"id":"988cb80b-d6be-4bb5-9c87-d09f93f58ed3","type":"user","name":"[email protected]"}}
**********
{"emit_time":"2023-11-08T08:14:09Z","level":"LOG_LEVEL_INFO","user_email":"[email protected]","operation":"UpdateNamespace","details":{"namespace":"audit-log-test.example-dev","client_ca_fingerprints":["f186d0bd971ff7d52dc6cc9d9b6f7644"],"search_attribute_update":{}},"status":"OK","category":"LOG_CATEGORY_ADMIN","log_id":"0mc69c0323b871293ce231dd1c7fb638","principal":{"id":"988cb80b-d6be-4bb5-9c87-d09f93f58ed3","type":"user","name":"[email protected]"}}
**********
{"emit_time":"2023-11-08T09:20:22Z","level":"LOG_LEVEL_INFO","user_email":"[email protected]","operation":"UpdateUserNamespacePermissions","details":{"namespace":"audit-log-test.example-dev","search_attribute_update":{}},"status":"OK","category":"LOG_CATEGORY_ADMIN","log_id":"0mc69c0323b871293ce231dd1c7fb639","principal":{"id":"988cb80b-d6be-4bb5-9c87-d09f93f58ed3","type":"user","name":"[email protected]"}}
**********
```
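In the dump above, entries are separated by `**********` lines. A minimal, stdlib-only sketch of splitting such a dump into individual decoded entries (the separator handling is based solely on the sample above):

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// parseAuditLogDump splits a raw dump on the "**********" separator
// (as shown in the sample above) and decodes each JSON entry into a
// generic map.
func parseAuditLogDump(dump string) ([]map[string]any, error) {
	var entries []map[string]any
	for _, chunk := range strings.Split(dump, "**********") {
		chunk = strings.TrimSpace(chunk)
		if chunk == "" {
			continue
		}
		var entry map[string]any
		if err := json.Unmarshal([]byte(chunk), &entry); err != nil {
			return nil, fmt.Errorf("decoding entry: %w", err)
		}
		entries = append(entries, entry)
	}
	return entries, nil
}

func main() {
	dump := `{"operation":"UpdateAccount","status":"OK"}
**********
{"operation":"DeleteUser","status":"OK"}
**********`
	entries, err := parseAuditLogDump(dump)
	if err != nil {
		panic(err)
	}
	fmt.Println(len(entries)) // 2
	fmt.Println(entries[1]["operation"])
}
```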

### Example of consuming an Audit Log

The following Go code is an example of reading Audit Logs from an S3 bucket, assuming you have already configured delivery from your Kinesis stream to S3.

```go
package main

import (
	"context"
	"fmt"
	"io"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
)

func main() {
	fmt.Println("print audit log from S3")
	cfg, err := config.LoadDefaultConfig(context.TODO(),
		config.WithSharedConfigProfile("your_profile"),
	)
	if err != nil {
		fmt.Println(err)
		return
	}
	s3Client := s3.NewFromConfig(cfg)
	response, err := s3Client.GetObject(
		context.Background(),
		&s3.GetObjectInput{
			Bucket: aws.String("your_bucket_name"),
			Key:    aws.String("your_s3_file_path"),
		})
	if err != nil {
		fmt.Println(err)
		return
	}
	defer response.Body.Close()

	content, err := io.ReadAll(response.Body)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(string(content))
}
```

The preceding code prints the logs to the terminal.
The following is a sample result.

```json
{
  "emit_time": "2023-11-14T07:56:55Z",
  "level": "LOG_LEVEL_INFO",
  "user_email": "[email protected]",
  "operation": "DeleteUser",
  "details": {
    "target_users": ["d7dca96f-adcc-417d-aafc-e8f5d2ba9fe1"],
    "search_attribute_update": {}
  },
  "status": "OK",
  "category": "LOG_CATEGORY_ADMIN",
  "log_id": "0mc69c0323b871293ce231dd1c7fb639",
  "request_id": "445297d3-43a7-4793-8a04-1b1dd1999640",
  "principal": {
    "id": "988cb80b-d6be-4bb5-9c87-d09f93f58ed3",
    "type": "user",
    "name": "[email protected]"
  }
}
```
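For compliance review, you often care about a subset of operations, such as `DeleteUser` in the sample above. The following is a small, hypothetical helper for filtering decoded entries by operation name; the entry shape is the generic map produced by `json.Unmarshal`:

```go
package main

import "fmt"

// filterByOperation returns only the entries whose "operation" field is
// one of the given operation names. Entries are generic maps as produced
// by json.Unmarshal.
func filterByOperation(entries []map[string]any, ops ...string) []map[string]any {
	wanted := make(map[string]bool, len(ops))
	for _, op := range ops {
		wanted[op] = true
	}
	var out []map[string]any
	for _, e := range entries {
		if op, ok := e["operation"].(string); ok && wanted[op] {
			out = append(out, e)
		}
	}
	return out
}

func main() {
	entries := []map[string]any{
		{"operation": "DeleteUser", "status": "OK"},
		{"operation": "UpdateNamespace", "status": "OK"},
		{"operation": "InviteUsers", "status": "OK"},
	}
	// Keep only user-deletion events for review.
	risky := filterByOperation(entries, "DeleteUser")
	fmt.Println(len(risky)) // 1
}
```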
169 changes: 169 additions & 0 deletions docs/production-deployment/cloud/audit-logging-gcp.mdx
Contributor
@MasonEgger could we keep that page (and the link to it) in a separate PR, and merge the rest ?
@alice-yin is going to double-check if we want the GCP PubSub details exposed yet; putting this as a separate PR will allow it to be ready to merge whenever, but to already have the reorg done.
Thoughts?

Contributor Author
Yes I will get a separate PR setup with the drafts. I may not update the other files until after the merge to avoid unnecessary merge conflicts and to make rebasing easier, but as soon as we do the GCP merge I'll rebase and get the links setup again.

Does this work for y'all?

Contributor
yeah all good

Contributor Author
Here's the PR. #3080

@@ -0,0 +1,169 @@
---
id: audit-logging-gcp
title: Audit Logging - GCP Pub/Sub
sidebar_label: GCP Pub/Sub
description: Audit Logging in Temporal Cloud provides forensic information, integrating with GCP Pub/Sub for secure data handling and supporting key Admin and API Key operations. This streamlines audit and compliance processes.
slug: /cloud/audit-logging-gcp
toc_max_heading_level: 4
keywords:
- audit logging
- explanation
- how-to
- operations
- temporal cloud
- term
- troubleshooting
- gcp
- pubsub
tags:
- audit-logging
- explanation
- how-to
- operations
- temporal-cloud
- term
- troubleshooting
- gcp
- pubsub
---

## Prerequisites

Before configuring the Audit Log sink, complete the following steps in Google Cloud:

1. Create a Pub/Sub topic and note its topic name, for example `test-auditlog`.
   - If you want to enable customer-managed encryption keys (CMEK), do so now.
2. Record the GCP Project ID that owns the topic.
3. Set up a service account in the same project that trusts the Temporal internal service account, so that Temporal can write to your topic. The Temporal Cloud UI walks you through this; you can create the service account in either of two ways:
   - Follow the instructions in the Cloud UI to set up the service account manually.
   - Use the [Terraform template](https://github.com/temporalio/terraform-modules/tree/main/modules/gcp-sink-sa) to create the service account.

## Temporal Cloud UI

![Temporal Cloud UI Setup for Audit Logging with GCP Pub/Sub](/img/audit-logging-pub-sub-gcp.png)

1. In the Cloud UI, navigate to **Settings** → **Integrations** → **Audit Log** and confirm that you see Pub/Sub as a sink option.
2. Configure the Audit Log:
   1. Choose **Pub/Sub** as the Sink type.
   2. Provide the following information:
      - Service account ID (from Prerequisite 3)
      - GCP Project ID (from Prerequisite 2)
      - Pub/Sub topic name (from Prerequisite 1)
   3. After you fill in the values, select **Create** to configure the Audit Log.
   4. Check back in a few minutes to confirm that everything was set up successfully.
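Once the sink is configured, Temporal delivers each audit log entry as a Pub/Sub message payload. The following stdlib-only sketch shows the decoding side; in a real consumer you would call `handleAuditMessage` from the `Receive` callback of a `cloud.google.com/go/pubsub` subscription (that wiring is assumed here and not shown), and the payload format is assumed to match the JSON format documented for Audit Logging:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// handleAuditMessage decodes one Pub/Sub message payload as an audit log
// entry. In a real consumer, this would be invoked from the Receive
// callback of a cloud.google.com/go/pubsub Subscription (assumed wiring,
// not shown here).
func handleAuditMessage(payload []byte) (map[string]any, error) {
	var entry map[string]any
	if err := json.Unmarshal(payload, &entry); err != nil {
		return nil, fmt.Errorf("decoding audit log entry: %w", err)
	}
	return entry, nil
}

func main() {
	msg := []byte(`{"operation":"UpdateNamespace","status":"OK","category":"LOG_CATEGORY_ADMIN"}`)
	entry, err := handleAuditMessage(msg)
	if err != nil {
		panic(err)
	}
	fmt.Println(entry["operation"], entry["status"]) // UpdateNamespace OK
}
```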

## More information

For more details, see [Audit Logging](https://docs.temporal.io/cloud/audit-logging).

### Example of consuming an Audit Log

The following Go code is an example of deserializing and printing exported workflow histories from a file.

```go
package main

import (
	"fmt"
	"os"

	"github.com/gogo/protobuf/jsonpb"

	// TODO: change path to your generated proto
	export "generated/exported_workflow"

	common "go.temporal.io/api/common/v1"
	enumspb "go.temporal.io/api/enums/v1"

	// TODO: change path to temporal repo
	ossserialization "go.temporal.io/server/common/persistence/serialization"
)

func extractWorkflowHistoriesFromFile(filename string) ([]*export.Workflow, error) {
	bytes, err := os.ReadFile(filename)
	if err != nil {
		return nil, fmt.Errorf("error reading from file: %w", err)
	}
	blob := &common.DataBlob{
		EncodingType: enumspb.ENCODING_TYPE_PROTO3,
		Data:         bytes,
	}
	result := &export.ExportedWorkflows{}
	err = ossserialization.ProtoDecodeBlob(blob, result)
	if err != nil {
		return nil, fmt.Errorf("failed to decode file: %w", err)
	}
	workflows := result.Workflows
	for _, workflow := range workflows {
		if workflow.History == nil {
			return nil, fmt.Errorf("history is nil")
		}
	}
	return workflows, nil
}

func printWorkflow(workflow *export.Workflow) {
	// Pretty-print the workflow history as JSON.
	marshaler := jsonpb.Marshaler{
		Indent:       "\t",
		EmitDefaults: true,
	}
	str, err := marshaler.MarshalToString(workflow.History)
	if err != nil {
		fmt.Println("error printing workflow history: ", err)
		os.Exit(1)
	}
	fmt.Println(str)
}

func printAllWorkflows(workflowHistories []*export.Workflow) {
	for _, workflow := range workflowHistories {
		printWorkflow(workflow)
	}
}

func printWorkflowHistory(workflowID string, workflowHistories []*export.Workflow) {
	if workflowID == "" {
		fmt.Println("invalid workflow ID")
		os.Exit(1)
	}
	for _, workflow := range workflowHistories {
		if workflow.History.Events[0].GetWorkflowExecutionStartedEventAttributes().WorkflowId == workflowID {
			fmt.Println("Printing workflow history for workflow ID: ", workflowID)
			printWorkflow(workflow)
			return
		}
	}
	fmt.Println("No workflow found with workflow ID: ", workflowID)
}

func main() {
	if len(os.Args) < 2 {
		fmt.Println("Please provide a path to a file")
		os.Exit(1)
	}
	filename := os.Args[1]
	fmt.Println("Deserializing exported workflow history from file: ", filename)
	workflowHistories, err := extractWorkflowHistoriesFromFile(filename)
	if err != nil {
		fmt.Println("error extracting workflow histories: ", err)
		os.Exit(1)
	}
	fmt.Println("Successfully deserialized workflow histories")
	fmt.Println("Total number of workflow histories: ", len(workflowHistories))
	fmt.Println("Choose an option:")
	fmt.Println("1. Print all the workflows")
	fmt.Println("2. Print the workflow history of a specific workflow")
	var option int
	fmt.Print("Enter your choice: ")
	if _, err = fmt.Scanf("%d", &option); err != nil {
		fmt.Println("invalid input.")
		os.Exit(1)
	}
	switch option {
	case 1:
		printAllWorkflows(workflowHistories)
	case 2:
		fmt.Println("Please provide a workflow ID:")
		var workflowID string
		if _, err = fmt.Scanf("%s", &workflowID); err != nil {
			fmt.Println("invalid input for workflow ID.")
			os.Exit(1)
		}
		printWorkflowHistory(workflowID, workflowHistories)
	default:
		fmt.Println("only options 1 and 2 are supported.")
		os.Exit(1)
	}
}
```
Loading