
Serverless Security Explained

Serverless security is the extra layer of protection designed for applications built on a serverless architecture. In this type of cloud computing, you write the code (functions) while the cloud provider handles the servers. That split of responsibilities calls for a different security approach.

Wiz Experts Team

Serverless computing: a refresher

Serverless computing is a cloud computing model where you, the developer, focus on writing and deploying code without worrying about the underlying servers or infrastructure. The cloud provider handles everything from provisioning servers to scaling them based on usage.

Serverless architectures eliminate many chores that come with cloud infrastructure setup and maintenance, including security-related tasks like installing security patches for language runtimes and operating systems. However, using a serverless architecture doesn’t mean your apps are invulnerable.

Security remains a concern in serverless computing. The shared responsibility model of cloud services, including serverless computing, shifts some security responsibilities to the cloud service provider but doesn't eliminate all security concerns for the client.

This article explains serverless security, introduces common security threats for serverless applications, and gives actionable advice about preventing them. Ready to make the most of serverless architecture? Let’s dive in.

What is serverless security?

Serverless security consists of the best practices and techniques that protect serverless workloads from unauthorized access and abuse.

With function as a service (FaaS), you implement a function that responds to events, and the platform usually exposes only a fraction of the underlying instance's features to the developer; the hidden features remain the cloud provider's responsibility. For organizations and developers, serverless security focuses on the new challenges serverless architectures bring, like keeping track of the increased number of cloud resources, each of which is a potential attack vector for a malicious user.

What benefits do serverless architectures bring?

Let’s look at the reasons why you would use serverless architecture before we take a closer look at the security aspects.

Flexible managed services

Serverless services are a subcategory of managed services. Their main differentiator is on-demand billing: if they go unused, you incur zero costs. For instance, FaaS lets you write complete programs in many programming languages without worrying about infrastructure maintenance. Using FaaS, you can build your own backend without having to keep an OS up to date or pay a monthly subscription for an instance you might not use 100% of the time.

Improved security

Like with all managed services, the cloud provider takes care of the OS and runtime security in a serverless environment. 

Another upside? FaaS is mostly stateless, which makes functions easier to maintain. Functions are distributed across different instances, and each execution is isolated and might not have access to the state of the previous one. If no state from previous executions is available, an attacker can't exploit it.

Some other security benefits are that serverless architectures have a lower code footprint since the cloud provider does most of the undifferentiated heavy lifting (e.g., networks, databases, gateways, etc.). And since functions are purpose-built for a small use case, it’s easier to keep track of them and ensure they have only the permissions they need.

On-demand billing

On-demand billing is a huge benefit that helps your bottom line. As we’ve mentioned, you only pay for what you use. If there is no function running or no data in a database, you don’t pay for it. Though one serverless execution might be more expensive than execution on traditional infrastructure that’s utilized to its full capacity all month, that scenario is rare, so it’s a good idea to move the risk associated with idling infrastructure to the cloud provider.

Why do serverless apps require security?

With its drastically increased number of resources, serverless technology is hard to monitor. You can have hundreds of functions that utilize dozens of databases and queues. While the functions themselves are easy to observe, the services they utilize are not, and a monitoring solution has to account for that. Otherwise, vulnerabilities might fly under the radar until it's too late.

Since every function can be an entry point to your system, you have to manage permissions for each, and the more functions you have, the more complex this can get. Let’s look at this in more detail, along with other common threats serverless architectures face.

Common serverless security threats

There are several reasons a serverless architecture might be susceptible to security vulnerabilities. Here are the seven most common:

1. Increased attack surface

Serverless architectures can consist of dozens, sometimes even hundreds, of small services that form a single application. This poses a risk for multiple reasons:

  • More services mean more cognitive load on the engineers who maintain them. While the managed nature of these services can lighten that load somewhat, issues might slip through the cracks once the number of services reaches a critical mass.

  • You can easily expose each serverless function to the public, creating an entry point into your system. It's critical to keep every serverless function in check so it doesn't become a liability.

  • Managing permissions for many services can become a full-time job if you don’t implement a reasonable process from the outset. Otherwise, giving each function full access is tempting when a deadline looms.

2. Event data injection

Injection vulnerabilities are the bane of every publicly exposed service. (Think JavaScript injections for HTML, SQL injections for relational databases, and event injection for serverless architectures.) 

If you integrate user-supplied parts of your event data into commands without sanitization, your app can be susceptible to an injection attack. For example, when you build a script via string concatenation and use a URL or filename from user input, you're introducing considerable risk.
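As a sketch of the risk, here are two hypothetical Python helpers (the function names and validation rule are illustrative, not from any specific framework): building a shell command by string concatenation lets event data inject extra commands, while validating the value and passing it as a discrete argument keeps a shell from ever interpreting it.

```python
def build_command_unsafe(filename: str) -> str:
    # DANGEROUS: user-supplied event data is concatenated into a shell
    # string, so "report.txt; rm -rf /" would inject a second command.
    return "cat " + filename

def build_command_safe(filename: str) -> list[str]:
    # Validate the value, then pass it as a discrete argv element so the
    # shell never interprets it. Rejecting a leading "-" also prevents
    # the input from being parsed as a command-line flag.
    stripped = filename.replace("-", "").replace("_", "").replace(".", "")
    if filename.startswith("-") or not stripped.isalnum():
        raise ValueError("unexpected characters in filename")
    return ["cat", filename]
```

The safe variant is meant to be passed to something like `subprocess.run(..., shell=False)`, where each list element arrives as a separate argument.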

3. Over-privileged functions

If you follow the principle of least privilege, your application will end up with airtight permissions. In theory, the permissions a serverless function requires are easier to assess than those of a monolithic service. After all, functions are small and purpose-built.

However, incomplete knowledge of available permissions or time pressure can lead an engineer to assign a function more permissions than it requires. And when a function is split because it grew too large, each new function needs only a fraction of its ancestor's permissions, but engineers often forget to trim them.

4. Compromised third-party code

While OS and runtime maintenance is your provider’s responsibility, you must choose libraries and frameworks and keep your application code up to date. Supply chain attacks are on the rise, and if you rely on third-party dependencies without checking their safety, you might install something that compromises your functions. 

5. Accidental state

Although serverless functions are commonly considered stateless, that's not entirely true. When a cold start happens, the runtime loads your function from scratch, but if your function hasn't been idle long enough to be evicted and receives another event, the runtime reuses the already loaded function. Everything your code does outside of the handler function is cached and becomes state that could be susceptible to a security vulnerability.
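A minimal Python sketch of how this happens (the handler and cache are hypothetical): anything defined at module level survives warm invocations on the same runtime instance, so data from one event can leak into a later one.

```python
# Module-level state survives "warm" invocations of the same runtime
# instance. Here, a cache written by one event is readable by the next.
_cache = {}

def handler(event, context=None):
    user = event["user"]
    if "token" in event:
        # State escapes this execution: it lives past the handler's return.
        _cache[user] = event["token"]
    # A later event on the same warm instance still sees the old token,
    # even though it never supplied one itself.
    return _cache.get(user)
```

Treat anything outside the handler as shared state: keep secrets out of it, or scope it per invocation.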

6. Side channel attacks

Using a VPC (a private network inside the cloud without direct public internet access) can lead to the false assumption that all services inside the VPC are secure because nobody can directly access the network from the outside. This assumption can lead people to give internal services more privileges than required. That's a big problem: an external attacker has many tools in their arsenal, even without direct access. And even peered VPCs can be at risk.

7. Billing attacks

Paying on demand is great, but there are downsides. A denial-of-wallet (DoW) attack is designed to take an app offline by racking up usage charges until the owner can no longer pay for it. According to OWASP, DoW attacks are a bigger threat to serverless applications than DoS attacks.
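A back-of-the-envelope sketch in Python (the per-request price is an assumed placeholder, not any provider's actual rate) shows how quickly sustained malicious traffic adds up under per-request billing.

```python
# Assumed placeholder price in USD per million requests -- real provider
# rates differ and exclude compute duration, egress, and downstream costs.
PRICE_PER_MILLION_REQUESTS = 0.20

def monthly_cost(requests_per_second: float, days: int = 30) -> float:
    # 86,400 seconds per day, converted to a request-based bill.
    total_requests = requests_per_second * 86_400 * days
    return total_requests / 1_000_000 * PRICE_PER_MILLION_REQUESTS
```

At 1,000 forged requests per second, request charges alone exceed $500 a month under this assumed rate, before adding function duration, egress, or downstream database costs. Concurrency limits and billing alarms cap this exposure.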

A few simple serverless security best practices

Now that we understand the potential issues, let’s look at the top seven prevention methods:

1. Grant minimal permissions

Always follow the principle of least privilege. If a function doesn’t write to a service, don’t give it write access. If a function uses only one bucket, don’t give it access to all buckets.
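As an illustration, here's a minimal IAM-style policy expressed as a Python dict (the bucket name is a hypothetical placeholder): it grants a single read action on a single bucket instead of wildcard access.

```python
# Least privilege: read-only access to one specific bucket, instead of
# "s3:*" on "*". "reports-bucket" is a hypothetical bucket name.
least_privilege_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::reports-bucket/*",
        }
    ],
}
```

If the function later needs to write, add `s3:PutObject` explicitly rather than widening the action to a wildcard.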

2. Take advantage of API gateways

Don’t expose all of your serverless functions directly to the public; use an API gateway instead. That way, you have only one entry point into your application and can manage public access in a central location.
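To illustrate the idea, here's a toy Python dispatcher (the routes and handlers are hypothetical): only the gateway is exposed, and it routes requests to internal functions from one central place where auth, throttling, and logging can live.

```python
# Internal functions -- never exposed to the public directly.
def orders_handler(event):
    return {"orders": []}

def users_handler(event):
    return {"users": []}

# The single public entry point maps paths to internal handlers.
ROUTES = {"/orders": orders_handler, "/users": users_handler}

def gateway(event):
    handler = ROUTES.get(event["path"])
    if handler is None:
        # Unknown paths are rejected centrally instead of per function.
        return {"status": 404}
    return handler(event)
```

A managed API gateway plays this role in production, adding authentication, rate limiting, and request validation before any function runs.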

3. Employ command query responsibility separation

Split your functions into reading and writing functions. This keeps each function's code footprint smaller and easier to monitor, and if one is compromised, the other is likely to be unaffected.
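A minimal Python sketch of the split (the in-memory dict stands in for a real datastore): one function only writes, the other only reads, so each needs exactly one permission and a compromised reader cannot mutate data.

```python
# Stand-in for a real datastore such as a key-value table.
table = {}

def write_handler(event, context=None):
    # Deployed with write permission only.
    table[event["id"]] = event["payload"]
    return {"status": "created"}

def read_handler(event, context=None):
    # Deployed with read permission only.
    return table.get(event["id"])
```

In a real deployment, each handler would be a separate function with its own narrowly scoped role.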

4. Scan your code

Follow the shift-left approach and solve your issues early in development. Utilize code security scanners that run in the IDE and your CI/CD pipeline to ensure you catch issues before they hit the cloud.

5. Prioritize monitoring, logging, and tracing

Use monitoring and observability tools and services in production. Otherwise, you'll have no way to discover what went wrong if you get hacked. Monitoring, logging, and tracing are essential components of ongoing maintenance.

6. Secure function URLs

If API gateways aren’t an option and you must use function URLs, make sure to keep them secure. We’ve published a detailed guide on our blog, so check it out.

Pro tip

Lambda function URLs may be simple, but like any other externally exposed resource in your cloud environment, they must be properly secured. While functions residing behind an API gateway or load balancer rely on the secure configuration of those frontend services, function URLs must be independently secured, and misconfigured instances pose an attractive target for malicious actors hoping to cause damage or gain access to sensitive data.


7. Leverage agentless scanning

Every sufficiently complex system includes multiple technologies to complete its tasks, and you might not be able to go 100% serverless for various reasons. But whether you’re completely or only partially serverless, it’s essential to have full visibility into your infrastructure. Agentless scanning provides that visibility without requiring you to deploy and maintain agents on every workload.

Wiz's approach to serverless security

Wiz is a unified cloud security platform that provides visibility and control over security risks in the cloud environment, including serverless architectures. Wiz assists with several serverless security use cases, including:

  • Detecting and analyzing serverless functions for various risk factors such as external exposure, identity and entitlements, misconfigurations, secrets, vulnerabilities, malware, and sensitive data

  • Mapping attributes associated with serverless functions, such as network settings and configurations, to calculate effective permissions and network exposure

  • Identifying and mapping libraries to corresponding vulnerabilities through analysis of compile time and runtime dependency files

  • Scanning function code for clear text secrets and sensitive data

Wiz can significantly enhance the security of serverless architectures by providing comprehensive visibility, identifying vulnerabilities and misconfigurations, ensuring compliance with security standards, detecting threats, and offering guidance to improve security posture.

