Workshop 10: S3

Workshop Goals

In this hands-on workshop, you’ll learn how to:
• Create and configure S3 buckets via AWS CLI
• Upload, sync, and manage objects in S3
• Apply bucket policies and access controls
• Enable versioning and lifecycle rules
• Host a static website using S3
• Clean up buckets and objects to avoid charges

Prerequisites

• AWS CLI v2 installed and configured
• IAM permissions: s3:CreateBucket, s3:PutObject, s3:DeleteObject, s3:PutBucketPolicy, s3:PutBucketVersioning, s3:PutBucketLifecycleConfiguration, s3:PutBucketWebsite, s3:DeleteBucket
• Basic knowledge of AWS regions and CLI usage

1. Create an S3 Bucket

First, you will create a new S3 bucket in your preferred region. Buckets serve as top‑level containers for storing objects. Bucket names must be globally unique.

aws s3api create-bucket \
  --bucket my-unique-workshop10-bucket \
  --region us-east-1

# In any region other than us-east-1, also add:
#   --create-bucket-configuration LocationConstraint=<region>

create-bucket Parameters Explained

--bucket Name of the bucket (must be unique across AWS).
--region AWS region to create the bucket in.
--create-bucket-configuration LocationConstraint Required in every region except us-east-1, which rejects it (omit the flag there).
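Because names are global, collisions with buckets in other AWS accounts are common. One way to pick a likely-unique name (the workshop10 prefix and suffix scheme here are just an example) is to append the date and the shell's process ID:

```shell
# Build a likely-unique, S3-valid bucket name: lowercase letters, digits,
# and hyphens, 3-63 characters. Adjust the prefix to your own naming scheme.
BUCKET="workshop10-$(date +%Y%m%d)-$$"
echo "${BUCKET}"
```

You can then pass --bucket "${BUCKET}" to the commands below instead of the literal name.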

2. Upload an Object

Next, upload a local file to the new bucket. The documents/ portion of the key is a prefix: S3 has no real folders, but prefixes behave like them in most tooling.


aws s3 cp ./local-file.txt s3://my-unique-workshop10-bucket/documents/local-file.txt

cp Parameters Explained

cp Command to copy files between local and S3.
./local-file.txt Source path on local filesystem.
s3://bucket/key Destination S3 URI including bucket and key (prefix).
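To confirm the upload landed under the documents/ prefix, you can list it (this assumes the same bucket name as above and valid credentials):

```shell
# List objects under the documents/ prefix
aws s3 ls s3://my-unique-workshop10-bucket/documents/
```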

3. Sync a Directory

Use the sync command to recursively upload all files in a directory. This ensures the bucket mirrors the local folder, transferring only new or changed files.

aws s3 sync ./website-content s3://my-unique-workshop10-bucket/website --delete

sync Parameters Explained

sync Recursively copies files and only transfers changes.
--delete Removes files in destination not present in source to keep mirror.
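Before letting --delete remove anything, you can preview what sync would do. The --dryrun flag is available on the aws s3 file commands and prints the planned operations without executing them:

```shell
# Show what sync would upload or delete, without making any changes
aws s3 sync ./website-content s3://my-unique-workshop10-bucket/website \
  --delete --dryrun
```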

4. Apply a Bucket Policy

To control access, apply a bucket policy that grants public read access or restricts access to specific principals. Create a JSON policy file and apply it with the CLI. Note that new buckets have Block Public Access enabled by default, so S3 will reject a public policy until those settings are relaxed (for example with aws s3api delete-public-access-block).


# policy.json contains JSON granting s3:GetObject to everyone
aws s3api put-bucket-policy \
  --bucket my-unique-workshop10-bucket \
  --policy file://policy.json

put-bucket-policy Parameters Explained

--bucket Target bucket name.
--policy Path to the JSON file defining permissions.
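As a sketch, a policy.json granting anonymous read access to every object in the bucket could look like this (substitute your bucket name in the Resource ARN):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-unique-workshop10-bucket/*"
    }
  ]
}
```

Apply public-read policies only to buckets that are genuinely meant to be public, such as the website bucket in step 6.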

5. Enable Versioning and Lifecycle Rules

Versioning protects against accidental deletions and overwrites by keeping multiple object versions. Lifecycle rules automate transitions to cheaper storage or expiration.

# Enable versioning
aws s3api put-bucket-versioning \
  --bucket my-unique-workshop10-bucket \
  --versioning-configuration Status=Enabled

# Add lifecycle rule to transition older objects to Glacier
aws s3api put-bucket-lifecycle-configuration \
  --bucket my-unique-workshop10-bucket \
  --lifecycle-configuration file://lifecycle.json

Lifecycle JSON file defines rules, e.g.:

{
  "Rules": [
    {
      "ID": "TransitionToGlacier",
      "Status": "Enabled",
      "Filter": {"Prefix": "backup/"},
      "Transitions": [
        {"Days": 30, "StorageClass": "GLACIER"}
      ],
      "Expiration": {"Days": 365}
    }
  ]
}
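You can confirm that both settings took effect with the corresponding read-only calls (same bucket name assumed):

```shell
# Should report "Status": "Enabled"
aws s3api get-bucket-versioning --bucket my-unique-workshop10-bucket

# Should echo back the lifecycle rules you uploaded
aws s3api get-bucket-lifecycle-configuration --bucket my-unique-workshop10-bucket
```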

6. Host a Static Website

S3 can serve static website content directly. Enable website hosting on the bucket and configure index and error documents.

aws s3 website s3://my-unique-workshop10-bucket/ \
  --index-document index.html \
  --error-document error.html

website Parameters Explained

--index-document Default document for root requests.
--error-document Page to show for 4XX errors.
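Once hosting is enabled, the site is served from the bucket's website endpoint, not the REST endpoint. The hostname format varies by region (older regions use a hyphen after s3-website, some newer ones a dot); for us-east-1 it can be constructed like this:

```shell
BUCKET="my-unique-workshop10-bucket"
REGION="us-east-1"
# us-east-1 uses the hyphenated s3-website-<region> form
URL="http://${BUCKET}.s3-website-${REGION}.amazonaws.com"
echo "${URL}"
```

Opening that URL in a browser should serve index.html, provided the objects are publicly readable (step 4).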

7. Clean Up Buckets and Objects

To avoid ongoing charges, delete all objects and the bucket when finished.


# Delete all current objects, then delete the bucket
aws s3 rb s3://my-unique-workshop10-bucket --force

rb Parameters Explained

rb --force Deletes the bucket after removing its current objects. On a versioning-enabled bucket, old object versions and delete markers are not removed by this command and must be deleted separately before the bucket can be deleted.
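For a versioned bucket, one hedged cleanup sketch is to batch-delete the versions first. The --query expression below builds the payload format that delete-objects expects; it handles up to 1,000 versions per call (larger buckets need pagination) and delete markers would need a second pass using DeleteMarkers[] in place of Versions[]:

```shell
BUCKET="my-unique-workshop10-bucket"

# Collect all object versions into the batch-delete payload format
# (assumes at least one version exists)
aws s3api list-object-versions --bucket "${BUCKET}" \
  --query '{Objects: Versions[].{Key: Key, VersionId: VersionId}}' \
  --output json > /tmp/versions.json

# Batch-delete the listed versions, then remove the now-empty bucket
aws s3api delete-objects --bucket "${BUCKET}" --delete file:///tmp/versions.json
aws s3 rb "s3://${BUCKET}"
```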

Next Steps

Explore S3 Transfer Acceleration for faster uploads, cross-region replication for disaster recovery, and S3 Event Notifications to trigger workflows with Lambda or SQS.

Previous: Workshop 9: RDS | Next: Workshop 11: Route 53
