Enabling the Server Access Logging property for all the buckets in AWS S3

Amazon Simple Storage Service (Amazon S3) is an object storage service that offers industry-leading scalability, data availability, security, and performance. This means customers of all sizes and industries can use it to store and protect any amount of data for a range of use cases, such as websites, mobile applications, backup and restore, archive, enterprise applications, IoT devices, and big data analytics. Amazon S3 provides easy-to-use management features so you can organize your data and configure finely-tuned access controls to meet your specific business, organisational, and compliance requirements. (Ref: https://aws.amazon.com/s3/)

As an organization grows, so does its data. The data can be in any format and serves various purposes: client data, organization data, reports generated from an application, data required to access certain features in an application, and so on. This data is grouped into a folder-like structure for easy accessibility. In AWS S3, these top-level containers are called buckets.

In order to ensure confidentiality for clients and for compliance purposes, it is good practice to log the requests made to a bucket to access its data. This log information can then be used for security and audit purposes.

In AWS S3, Server Access Logging is the property that serves exactly this purpose.

To see the property, log in to the AWS Management Console and look for the AWS S3 service.

Select any of the buckets; you will see the Properties tab, as shown in the screenshot below.

When you navigate to Properties, you can see a long list of properties. Look for the Server access logging property.

Click on Server access logging; two options are displayed. By default, the “Disable logging” option is selected.

Select the “Enable logging” option; it will ask for a log storage bucket. Provide the name of the bucket in which you want to store the logs generated by the current bucket.

Example: Create a bucket in the same account, say “log-archives-server-access-logs”. For the bucket “medium-sample-test-bucket” (shown in the first screenshot), enable server access logging, set the Target bucket to “log-archives-server-access-logs” and the Target prefix to “medium-sample-test-bucket”.

Note: the Target prefix is set to the name of the bucket on which we enabled server access logging. The reason for this: suppose you use the same bucket, “log-archives-server-access-logs”, to collect the logs of all the buckets in the account. To identify which source bucket each log came from, we set the Target prefix to the name of the bucket for which logging was enabled.

After clicking Save, server access logging is enabled.
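The same setting can also be applied with the aws-cli (listed under Installation below). A minimal sketch using the example bucket names above; the trailing slash on the prefix is an optional convention to keep each bucket's logs in their own subfolder:

```shell
#!/bin/sh
# Bucket names taken from the example above.
SOURCE_BUCKET="medium-sample-test-bucket"
TARGET_BUCKET="log-archives-server-access-logs"

# Build the logging configuration for put-bucket-logging.
cat > logging.json <<EOF
{
  "LoggingEnabled": {
    "TargetBucket": "$TARGET_BUCKET",
    "TargetPrefix": "$SOURCE_BUCKET/"
  }
}
EOF

# Apply it (shown commented out: it needs aws-cli configured with
# credentials for the account that owns both buckets):
# aws s3api put-bucket-logging --bucket "$SOURCE_BUCKET" \
#   --bucket-logging-status file://logging.json
```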

Problem Statement:

Consider a scenario where we need to enable the server access logging property for all the buckets in an account (Account A). The logs generated from all the buckets have to be collected in a separate log-archive bucket, with a subfolder per source bucket (the name of the subfolder should be the name of the bucket for which the logs are collected). The collected logs should then be copied to another bucket in the compliance account (Account Z).

Let's see how this is done. Enabling this property by hand for each and every bucket, and repeating the same activity across accounts (dev, prod, staging, test), is a tedious task. To ease our work, let's script it out.

Steps to resolve the use-case:

  1. Create a bucket “server-access-logging-account-a” in Account Z and enable versioning. Attach a bucket policy to it to allow access from Account A.
  2. Create a bucket “server-access-logs” in Account A. Enable versioning for the bucket.

Question: Why do we need two buckets? Why can't we give the name of the bucket created in Account Z directly to the buckets in Account A when enabling the server access logging property?

Answer: A cross-account setup is not supported by Server Access Logging. The target bucket we specify when enabling this property must belong to the same account as the bucket being enabled. [i.e. the target bucket where logs are stored and the bucket on which we enable the property must both belong to Account A]

3. As per our requirement, we need to copy the logs from the bucket created in Step 2 (Account A) to the bucket created in Step 1 (Account Z). Since a cross-account setup is not supported by server access logging, we need to find another way to do it.

Yes, bucket replication is one possible solution. We enable the replication property on the bucket in Account A, pointing to the bucket in Account Z. Files logged into the bucket from Step 2 will automatically be copied, or replicated, to the bucket in Account Z from Step 1. To implement this, we need to do the following:

  • Create a role ‘s3-replication-role’ in Account A, with an assume-role (trust) policy, to enable replication
  • Grant the role the access needed to replicate between the buckets in Account A and Account Z: create a policy “s3-replication-role-policy” and attach it to “s3-replication-role”
  • Create a replication-setup policy JSON file with the details of the bucket (“server-access-logging-account-a”) in Account Z, and attach it to the bucket (“server-access-logs”) in Account A to enable replication
  • Loop through all the buckets in Account A and set the server access logging property, with the target bucket “server-access-logs” (Step 2) and the target prefix set to the name of the bucket being enabled

Let's script it out, starting from Step 2.

Role creation and trusted policy attachment
trustedpolicy.json
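A sketch of this step, assuming the standard trust-policy shape that S3 replication requires (the file and role names follow the article; the create-role call is shown commented out since it needs Account A credentials):

```shell
#!/bin/sh
# trustedpolicy.json: the trust (assume-role) policy that lets the
# S3 service assume the replication role.
cat > trustedpolicy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "s3.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

# Create the role in Account A:
# aws iam create-role --role-name s3-replication-role \
#   --assume-role-policy-document file://trustedpolicy.json
```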
Account A Bucket Creation, versioning enabled as mentioned in step 2
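A sketch of this step. The AWS variable defaults to `echo aws`, so running it only prints the commands; set `AWS=aws` (with Account A credentials configured, and `--create-bucket-configuration LocationConstraint=<region>` added when outside us-east-1) to execute them:

```shell
#!/bin/sh
# Dry-run by default: prints the aws commands instead of running them.
AWS="${AWS:-echo aws}"

create_log_bucket() {
  # Step 2: log-collection bucket in Account A, with versioning on
  # (versioning is required on both ends of a replication setup).
  $AWS s3api create-bucket --bucket server-access-logs
  $AWS s3api put-bucket-versioning --bucket server-access-logs \
    --versioning-configuration Status=Enabled
}

create_log_bucket
```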
Creating policy JSON file and attaching it to the role created
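A sketch of this step. The statements below are the permissions S3 replication generally needs (read and replicate from the source, write replicas to the destination); attaching with `aws iam put-role-policy` as an inline policy is one way to do the "create and attach" described above:

```shell
#!/bin/sh
# s3-replication-role-policy.json: permissions for the replication role.
cat > s3-replication-role-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetReplicationConfiguration", "s3:ListBucket"],
      "Resource": "arn:aws:s3:::server-access-logs"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObjectVersionForReplication",
        "s3:GetObjectVersionAcl",
        "s3:GetObjectVersionTagging"
      ],
      "Resource": "arn:aws:s3:::server-access-logs/*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:ReplicateObject", "s3:ReplicateDelete", "s3:ReplicateTags"],
      "Resource": "arn:aws:s3:::server-access-logging-account-a/*"
    }
  ]
}
EOF

# Attach it to the role created earlier:
# aws iam put-role-policy --role-name s3-replication-role \
#   --policy-name s3-replication-role-policy \
#   --policy-document file://s3-replication-role-policy.json
```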
Creating replication policy JSON file and attaching it to the bucket created
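A sketch of this step, using a minimal replication configuration (the older schema with an empty Prefix, which replicates every object). `<ACCOUNT_A_ID>` is a placeholder for the Account A account ID:

```shell
#!/bin/sh
# replication.json: points the Account A log bucket at the Account Z
# bucket, replicating via the role created earlier.
cat > replication.json <<'EOF'
{
  "Role": "arn:aws:iam::<ACCOUNT_A_ID>:role/s3-replication-role",
  "Rules": [
    {
      "ID": "replicate-access-logs",
      "Prefix": "",
      "Status": "Enabled",
      "Destination": {
        "Bucket": "arn:aws:s3:::server-access-logging-account-a"
      }
    }
  ]
}
EOF

# Attach it to the Account A bucket to enable replication:
# aws s3api put-bucket-replication --bucket server-access-logs \
#   --replication-configuration file://replication.json
```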
Loop through all buckets, enabling the server access logging property via a JSON file
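A sketch of this step. `enable_logging` builds the logging configuration for one bucket; the commented loop lists all the bucket names with `list-buckets` and jq (both listed under Installation) and skips the log bucket itself, since a bucket should not log to itself:

```shell
#!/bin/sh
# Dry-run by default: AWS="echo aws" prints the commands instead of
# running them; set AWS=aws with Account A credentials to execute.
AWS="${AWS:-echo aws}"
LOG_BUCKET="server-access-logs"

enable_logging() {
  bucket="$1"
  # Target prefix = source bucket name, so each bucket's logs land
  # in their own subfolder of the log bucket.
  printf '{"LoggingEnabled":{"TargetBucket":"%s","TargetPrefix":"%s/"}}' \
    "$LOG_BUCKET" "$bucket" > logging.json
  $AWS s3api put-bucket-logging --bucket "$bucket" \
    --bucket-logging-status file://logging.json
}

# Loop over every bucket in the account:
# for bucket in $(aws s3api list-buckets | jq -r '.Buckets[].Name'); do
#   [ "$bucket" = "$LOG_BUCKET" ] || enable_logging "$bucket"
# done
```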

After running this script, all the buckets in S3 will have the server access logging property enabled.

Bucket Policy to be set for a bucket in Account Z

Note: The masked part in the above screenshot is the account ID of Account A.
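Since the screenshot is masked, here is a sketch of what such a policy can look like: `<ACCOUNT_A_ID>` stands in for the Account A account ID, and the statement grants the replication role the replicate actions on the destination bucket. Apply it with Account Z credentials:

```shell
#!/bin/sh
# bucketpolicy.json: lets the Account A replication role write
# replicas into the Account Z destination bucket.
cat > bucketpolicy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowReplicationFromAccountA",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<ACCOUNT_A_ID>:role/s3-replication-role"
      },
      "Action": ["s3:ReplicateObject", "s3:ReplicateDelete", "s3:ReplicateTags"],
      "Resource": "arn:aws:s3:::server-access-logging-account-a/*"
    }
  ]
}
EOF

# Apply it in Account Z:
# aws s3api put-bucket-policy --bucket server-access-logging-account-a \
#   --policy file://bucketpolicy.json
```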

Installation:

  1. aws-cli: https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2-linux.html
  2. jq : https://stedolan.github.io/jq/download/
  3. Shell Script tutorial: https://www.tutorialspoint.com/unix/shell_scripting.htm

Happy Learning !!!

Senior Software Engineer | Java | Microservices | AWS | Terraform