
AWS S3 Security Best Practices

Wiz Experts Team

As its name implies, storing your data in the cloud with Amazon Simple Storage Service (S3) is a streamlined process, but it’s important to remember there are still risks. With each file you store on S3, you increase the chances of compromised private data if threat actors breach the system. That’s why following security best practices is a must.

This article will refresh your knowledge of AWS and S3 security basics and then move into the best practices you need to get started with S3 security.

S3 security and the shared responsibility model

The first step toward mastering S3 security is understanding where Amazon’s responsibilities stop and yours start. To clarify who is responsible for what, AWS uses a framework called the shared responsibility model. This model defines what they see as their responsibilities and what falls to AWS customers.

Simply put, AWS takes care of the security of the cloud. They are responsible for cloud infrastructure, like data centers, network architecture, and the hardware that runs the services. They hire and manage the security staff who protect data center buildings (and everything in them) and ensure that the networks connecting everything are safe from malicious actors.

Your responsibility is security in the cloud, which relies on following the principle of least privilege: granting only the permissions required for the task at hand, and no more. The principle applies to every permission system AWS offers for S3, and what adhering to it looks like differs depending on which one you use. For example, least privilege with IAM might mean read-only access to a single bucket, while with access control lists it might mean read-only access to a single object, since ACLs offer more granular, object-level control.

Pro tip

Remember that the least privilege principle is no substitute for encryption: you still need to encrypt your data in case an attacker circumvents access controls by finding a security vulnerability. It's also important to adopt AWS S3 security best practices across the entire software development lifecycle; otherwise, your data might be secure inside your production environment but at risk from a forgotten developer account in the development environment.

Of course, the shared responsibility model brings a few challenges. As its name implies, S3 is considered a simple service, and it's not as complex as a database or block storage: You just upload your files, and S3 takes care of the rest. However, this doesn't free you from learning about best practices or configuration options in detail. AWS customers often underestimate their configuration responsibilities, which can lead to problems. Even big organizations like Microsoft have made this error: in 2022, thousands of Microsoft records containing sensitive user data were exposed because of a misconfiguration. And our research shows that permissions are often overlooked in the shared responsibility model; in fact, 76% of companies have third-party roles that allow for full account takeover.

S3 security best practices and recommendations

Now that you understand the basic idea behind security in the cloud, let's go into more detail.

Store only the required data

If you created a weather app, you wouldn't ask users for their medical history, but sometimes it's not 100% clear what user data you need to run your service. You might be tempted to collect a user's phone number even when their email address would be enough for your use case.

Take a moment and think about your data requirements. Since nobody can steal data that doesn’t exist, ensure you store only what you need. Organizations tend to collect as much data from their users as possible because knowledge is power, but data—especially private or health-related data—can quickly become a liability. 

There's another reason for minimal data collection: every piece of data you store has to be secured, which means additional work. Data you don't keep is secure by default, freeing up resources for more important tasks.

This also means deleting sensitive data you no longer need as soon as possible. Sure, compliance with specific laws sometimes requires you to keep particular data around, but beyond that, forgetting to delete sensitive objects in a bucket could put you at risk. The only thing worse than leaking data is leaking data users didn't think you (still) had.
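One way to make that deletion automatic is an S3 lifecycle rule. A minimal sketch, assuming a hypothetical bucket layout where short-lived uploads land under a `tmp/` prefix and a hypothetical 30-day retention window (this is the JSON shape that boto3's `put_bucket_lifecycle_configuration` expects):

```python
import json

# Sketch only: the "tmp/" prefix and 30-day window are hypothetical values.
# Pick the shortest retention your use case and compliance rules allow.
lifecycle_config = {
    "Rules": [
        {
            "ID": "expire-temporary-uploads",
            "Filter": {"Prefix": "tmp/"},
            "Status": "Enabled",
            # Permanently delete matching objects 30 days after creation.
            "Expiration": {"Days": 30},
        }
    ]
}

# Applied with boto3 (not run here):
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-bucket", LifecycleConfiguration=lifecycle_config)
print(json.dumps(lifecycle_config, indent=2))
```

Lifecycle rules run daily on S3's side, so once the rule is in place, no application code has to remember to clean up.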

Implement least privilege access

We discussed it above, but it can’t be stressed enough. The principle of least privilege is at the core of all security best practices. The principle states that you should assign only the permissions required for a particular job. If you give a service too many permissions, even the best security tools and encryption can’t protect your data. 

If you don’t take the time to create a custom IAM role that can only read objects from one bucket but instead assign S3 full-access policies everywhere or even create public buckets, AWS will let that happen—whether it’s a good idea or not.

On S3, the principle of least privilege involves limits on which buckets and objects a service can access and the actions it can perform on them. If a service just uses one bucket, don’t give it access to all buckets. If a service only reads data, don’t give it write access. 

The worst-case scenario is granting S3 admin access to a service: if that service is compromised, attackers can read, write, update, and delete your data.
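As an illustration of the custom read-only role mentioned above, here is a sketch of an IAM policy document scoped to a single bucket. The bucket name is hypothetical; note that `s3:ListBucket` applies to the bucket ARN while `s3:GetObject` applies to the objects inside it:

```python
import json

# Hypothetical bucket name for illustration.
bucket = "example-app-assets"

# Read-only access to exactly one bucket, and nothing else.
read_only_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ListOneBucket",
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": f"arn:aws:s3:::{bucket}",
        },
        {
            "Sid": "ReadObjectsOnly",
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
        },
    ],
}

print(json.dumps(read_only_policy, indent=2))
```

Because no `s3:PutObject` or `s3:DeleteObject` actions appear, a compromised service holding this role can leak data at worst, but never tamper with or destroy it.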

Encrypt data at rest and in transit

Encryption is the next crucial piece of the S3 security puzzle. Encrypted data is protected from unauthorized reads, even if an attacker accesses it via a channel you didn’t anticipate. 

Encryption at rest means the data is encrypted when written to a bucket. If AWS fails to provide security of the cloud and someone gets access to your S3 buckets, they won’t be able to use your encrypted data.

Encryption in transit, on the other hand, means data gets encrypted while it’s transferred over a network. Always upload and download data from your buckets via HTTPS; otherwise, everyone on the network can read what you send.

Figure 1: Enabling bucket encryption
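Both halves can be expressed as configuration. A minimal sketch, with a hypothetical bucket name: the first dict is the default-encryption setting (the shape boto3's `put_bucket_encryption` expects), and the second is a bucket policy that denies any request not made over HTTPS via the `aws:SecureTransport` condition key:

```python
import json

bucket = "example-app-assets"  # hypothetical name

# At rest: default server-side encryption with a KMS key for every new object.
encryption_config = {
    "Rules": [
        {
            "ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms"},
            "BucketKeyEnabled": True,  # fewer KMS calls, lower cost
        }
    ]
}

# In transit: deny every request that arrives over plain HTTP.
enforce_tls_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyInsecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{bucket}",
                f"arn:aws:s3:::{bucket}/*",
            ],
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        }
    ],
}

print(json.dumps(enforce_tls_policy, indent=2))
```

The explicit Deny wins over any Allow elsewhere, so even a misconfigured client can't fall back to unencrypted transport.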

Enable versioning for buckets

Things always go wrong, especially when you don’t think they will. Versioning gives you a safety net for your data and widens your margin of error when using S3. If someone accidentally or maliciously edits or deletes an object from a bucket, S3 will create a new version and keep the old one. For the consuming service, a versioned bucket behaves like a regular one; S3 hides old versions by default but allows you to access them via additional actions. 

The downside of versioning is that it raises storage costs. If your objects are prone to many changes, and every change results in a copy, costs might multiply quickly. Consider lifecycle rules that store older versions on cheaper storage tiers in this case.

Figure 2: Enabling bucket versioning
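The versioning switch and the cost-containing lifecycle rule from the paragraph above can be sketched as follows. The 30-day and 365-day windows are hypothetical; the first dict is what boto3's `put_bucket_versioning` expects, the second what `put_bucket_lifecycle_configuration` expects:

```python
import json

# Turn versioning on for the bucket.
versioning_config = {"Status": "Enabled"}

# Contain the cost: move noncurrent (old) versions to a cheaper tier after
# 30 days and delete them after a year. Both windows are hypothetical.
version_lifecycle = {
    "Rules": [
        {
            "ID": "tier-and-expire-old-versions",
            "Filter": {},  # empty filter applies the rule to all objects
            "Status": "Enabled",
            "NoncurrentVersionTransitions": [
                {"NoncurrentDays": 30, "StorageClass": "GLACIER"}
            ],
            "NoncurrentVersionExpiration": {"NoncurrentDays": 365},
        }
    ]
}

print(json.dumps(version_lifecycle, indent=2))
```

Only noncurrent versions are touched; the live copy of every object stays in the standard storage class.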

Enable logging for buckets

Logs are the bread and butter of cloud security because they give you insight into access patterns and who is using your data on S3. With logging, you can check who is accessing what, then use that information to tighten access rules or find out whether someone accessed data they shouldn't have.

Some compliance standards require logging: to meet them, each data request must be traceable to the user who made it. For example, insider trading laws forbid people with inside knowledge from trading on that knowledge. Logging lets you check who accessed insider information and take legal action if they use it for trading.

Figure 3: Enabling bucket access logging
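Server access logging is configured per bucket. A minimal sketch with hypothetical bucket names and prefix (the shape boto3's `put_bucket_logging` expects); S3 then writes one log record per request to the target bucket:

```python
import json

# Hypothetical names: logs for "example-app-assets" are delivered to a
# separate, locked-down log bucket under a per-source prefix.
logging_config = {
    "LoggingEnabled": {
        "TargetBucket": "example-log-bucket",
        "TargetPrefix": "s3-access-logs/example-app-assets/",
    }
}

print(json.dumps(logging_config, indent=2))
```

Keeping the log bucket separate from the data bucket matters: it avoids logging loops and lets you restrict who can read (or delete) the audit trail.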

Block public buckets at the account or organization level

Blocking the creation of public buckets entirely in your AWS account or organization is useful for enforcing explicit access rules for all buckets. When nobody can create a public bucket anymore, everyone has to use IAM, bucket policies, or access control lists to make a bucket accessible. 

If you want to block public buckets for only a few accounts, or you just have one account, you can also apply the block at the account level.

Figure 4: Blocking public buckets at the account level
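The block is a set of four switches. A sketch of the configuration with all of them on (the shape boto3's `put_public_access_block` expects; the same structure applies at the account level through the S3 Control API):

```python
import json

# All four switches on: block new public ACLs and policies, and ignore or
# override any that already exist.
public_access_block = {
    "BlockPublicAcls": True,        # reject requests that add public ACLs
    "IgnorePublicAcls": True,       # treat existing public ACLs as private
    "BlockPublicPolicy": True,      # reject bucket policies that allow public access
    "RestrictPublicBuckets": True,  # limit public-policy buckets to AWS principals
}

print(json.dumps(public_access_block, indent=2))
```

Unless you knowingly host public content (say, a static website) from a specific bucket, all four flags should stay `True` everywhere.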

Enable AWS Backup for important buckets

AWS Backup is an automatic backup service for AWS storage. As with versioning, it allows for a higher margin of error. Whether data was deleted or modified accidentally or maliciously, AWS Backup enables you to restore it. 

Figure 5: Enabling AWS Backup for a bucket

AWS Backup might seem redundant with bucket versioning, but it's more flexible. While versioning saves a new object version for every change, AWS Backup saves the whole bucket at a specified interval and lets you define how long to retain each backup. You can even run multiple backups on different schedules. AWS Backup also lets you restore data to a different location, while versioning restores in place. These features make AWS Backup better suited to disaster recovery than bucket versioning.
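The interval and retention described above live in a backup plan. A minimal sketch with hypothetical vault name, schedule, and retention (the shape boto3's `backup.create_backup_plan` expects in its `BackupPlan` argument):

```python
import json

# Hypothetical plan: one daily rule at 05:00 UTC, backups kept for 35 days,
# delivered to a backup vault named "example-vault".
backup_plan = {
    "BackupPlanName": "s3-daily",
    "Rules": [
        {
            "RuleName": "daily-0500-utc",
            "TargetBackupVaultName": "example-vault",
            # AWS cron format: minute hour day-of-month month day-of-week year
            "ScheduleExpression": "cron(0 5 * * ? *)",
            "Lifecycle": {"DeleteAfterDays": 35},
        }
    ],
}

print(json.dumps(backup_plan, indent=2))
```

Adding a second rule with a weekly schedule and longer retention gives you the multi-interval setup mentioned above without touching the first rule.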

Going beyond the basics with Wiz 

Wiz helps organizations identify and remediate critical risks in their AWS environments. It integrates with 50+ AWS services to provide complete visibility into your cloud estate and uses machine learning to identify risks that are often missed by traditional security tools.

How Wiz works with AWS:

  • Visibility and context: Wiz integrates with AWS services to collect logs and other data from your AWS resources, and uses machine learning to identify patterns that indicate risks. For example, Wiz can integrate with AWS CloudTrail to collect logs from your AWS resources and identify patterns that indicate suspicious activity.

  • Secure ML models: Wiz recently announced support for Amazon SageMaker. Joint customers can now have peace of mind because Wiz can monitor and manage the security risks associated with building and deploying AI/ML models.

  • Remediation recommendations: Once Wiz has identified a risk, it provides specific recommendations for remediation, such as changing a password for a user or enabling two-factor authentication for a resource.

  • Remediation automation: Wiz can automate the remediation of some risks, such as changing passwords or enabling two-factor authentication. This can help organizations to reduce the time and effort required to keep their AWS environments secure.
