These processes are used by DevOps practitioners to increase the velocity of software updates, improve the reliability of the release mechanism, and raise the quality and security of the product.
What is a CI/CD Pipeline?
A CI/CD pipeline is the term used in software development for the automation of the processes that enable the release of code changes from the development environment through to production. Building, testing, and promoting code through environments before eventual production release can be complicated and time-consuming, involving several different teams and toolsets. Automating the CI/CD pipeline makes sense given the repetitive tasks required for successful code deployment, and it ensures consistency in what could otherwise be a complex set of interdependent components.
CI/CD pipelines are essential to an organization's ability to embrace DevOps and deliver services and applications more efficiently, and DevSecOps promotes a secure-by-design approach throughout the software development lifecycle (SDLC). By conducting analysis and security activity in the early stages of the SDLC, vulnerabilities can be identified as early as possible, enabling risk assessments and decisions about mitigations, and allowing vulnerabilities to be resolved before code is promoted to the next stage. This approach aligns far better with the increased velocity and agility of DevOps, with which the old-fashioned approach of security as a quality gate at the end of the lifecycle is simply incompatible. Integrating toolsets into the CI/CD pipeline results in reduced SDLC friction, improved collaboration, and a more efficient path to production.
Containers and CI/CD Pipelines
Containerized applications may include exploitable vulnerabilities hidden deep in the lower levels of images, where change is infrequent and opportunities to discover such vulnerabilities are limited. The use of CI/CD pipelines to automate software deployment makes it easy to add tools into the process flow that enable code compilation, unit testing, code analysis, and shift-left security, as well as automating packaging into container images for deployment in containerized environments.
Scanning container images for vulnerabilities is vitally important, with many built using open-source components from public sources whose integrity cannot be relied upon. It is also common for container images to include misconfiguration, which may introduce vulnerabilities for future exploitation, and exposed secrets which may lead to the compromise of other applications and services.
DevSecOps methodology demands all container images be scanned as early as possible in the SDLC, with any image sourced externally subject to scanning upon download and any component scanned before its addition to the main code. Integrating scanning tools designed to inspect code as it is merged into the master branch, as well as in transit between environments, can help identify vulnerabilities early in the development lifecycle. Vulnerability scanning should be included in the pipeline as business as usual, ensuring the finished product’s integrity.
The recommended image re-scanning can be automated by the CI/CD pipeline to recheck containers after each release, ensuring they remain free of known vulnerabilities.
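As a concrete illustration, a pipeline stage can run an image scanner and block promotion when serious findings appear. The sketch below mocks the scanner's JSON report with an inline file so the gating logic is self-contained; a real pipeline would generate the report with a scanner such as Trivy, and the image name shown in the comment is a placeholder.

```shell
#!/bin/sh
# Sketch of a scan gate in a pipeline stage. The scanner output is
# mocked here; a real pipeline would produce scan.json with a tool
# such as Trivy, e.g.:
#   trivy image --format json --output scan.json registry.example.com/app:latest
set -eu

cat > scan.json <<'EOF'
{"Results":[{"Vulnerabilities":[{"VulnerabilityID":"CVE-2024-0001","Severity":"HIGH"}]}]}
EOF

# Block promotion if any HIGH or CRITICAL findings are present.
if grep -Eq '"Severity":"(HIGH|CRITICAL)"' scan.json; then
  echo "scan gate: HIGH/CRITICAL findings - blocking promotion"
  gate_result=blocked
  # a real pipeline stage would exit non-zero here to halt the run
else
  echo "scan gate: image clean - promoting"
  gate_result=promoted
fi
rm -f scan.json
```

The same gate can run at each stage boundary: on image download, before merge, and again before deployment to the registry.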
Effective management of container vulnerabilities means mapping them to containers to ensure the proper targeting of mitigation effort, as well as properly expressing associated risks in terms of impact and likelihood. Simple steps may include the removal of unused components, upgrading or patching any out-of-date elements, and ensuring approved images, registries, and files are used in the creation of container images. Least privilege should be applied at runtime to ensure processes and applications run with the minimum permissions necessary to function. This means that any successful exploit will be similarly limited, owing to the restricted permissions it inherits from the compromised process.
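To illustrate least privilege at runtime, the snippet below assembles a `docker run` command that runs the container as a non-root user, drops all Linux capabilities, blocks privilege escalation, and mounts the filesystem read-only. The flags are standard Docker options; the image name `myapp:1.0` and the UID/GID are placeholders, and the command is only echoed here rather than executed.

```shell
#!/bin/sh
# Sketch: constraining a container at runtime with least privilege.
# Flags are standard Docker options; "myapp:1.0" and the UID/GID
# are placeholders. The command is printed, not executed.
docker_cmd="docker run --user 10001:10001 --read-only --cap-drop ALL --security-opt no-new-privileges myapp:1.0"
echo "$docker_cmd"
```

A process compromised inside such a container cannot escalate privileges, write to the image filesystem, or use capabilities it was never granted.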
By scanning container images in the CI/CD pipeline before deployment to the registry, exposed secrets and vulnerabilities can be spotted and mitigated before exposure to any associated risks occurs. Ongoing scans of container registries complement this by ensuring that no vulnerabilities have been introduced before deployment to a runtime environment. Furthermore, continuous scanning of images at runtime helps ensure that container security vulnerabilities, malware, misconfigurations, and exposed secrets are identified and remediated quickly.
Benefits of the CI/CD Pipeline
CI/CD underpins the DevOps methodology, bringing software development and operations functions together to deploy applications. The benefits of this include:
Incremental changes: CI/CD enables more frequent, smaller changes, rather than the traditional approach of bundling up large bodies of change for release at the same time. Smaller changes are easier to produce, test, scan, and release, and any issues can be identified before code progresses through the lifecycle, resulting in faster deployments, more expedient remediation, and fewer issues post-launch.
Automatic isolation: In a traditional development model where a point release of a product may include hundreds of discrete fixes and enhancements, it may be very difficult to find which of the many new components introduced a functional issue or security vulnerability subsequently identified. CI/CD resolves this by its nature, by promoting the frequent release of small changes. Limiting the scope of problems in this manner makes their resolution quicker, easier, and cheaper.
Reduced Backlog: Using CI/CD to deliver frequent updates to products has two very important benefits for your backlog. Firstly, more frequent change schedules mean items being addressed more quickly. Secondly, automated scanning tools mean defects are detected early and don’t land on the backlog in the first place.
Continuous Improvement: Frequent updates and automated processes provide more feedback for technical teams. Build failures, design issues, and merge problems are all opportunities to make improvements to both process and product.
Reduced Costs: CI/CD pipelines enable process automation, which reduces tech-team time spent on deployment-related activities. It also reduces faults, as well as resulting in faster fault resolutions. Teams have more time, and product quality increases.
Reduced Risk: Automating security processes, and shifting left, results in the identification of security issues earlier in the life cycle, meaning they can be remediated at source and before they become vulnerabilities in production.
Automated testing: The automation of testing within the CI/CD pipeline enables continuous deployment, resulting in better quality products released more reliably, and with better baseline security, which results in reduced costs and improved customer satisfaction.
Increased velocity: Efficiencies in the software development lifecycle mean a shorter path to market for new applications and features, as well as opportunities for more frequent updates without a corresponding overhead on development and operations staff. Organizations utilizing CI/CD methodologies to deliver software can be responsive to customer demands, as well as their own interests, giving them an advantage over their competitors.
Efficiency and Consistency: The automation of integration and deployment tasks reduces the overhead on the team, enabling them to focus on higher quality outputs. For example, a security analyst can use the time they previously spent manually scanning code in support of a release to focus on zero-day exploits or future enhancements. The consistency of automation reduces the possibility of human error. Again, cost to the organization is reduced, at the same time as a reputation for quality products delivered on time is built.
CI/CD pipelines began as scripted routines and Makefiles, which grew over time to provide an increasing number of features. More recently, applications designed specifically to perform these functions have come to market and have become extremely popular, owing to their ability to simplify multiple streams of activity under one process umbrella. What began with on-prem offerings from the likes of Jenkins, storing configuration data as flat files on servers accessed via a user interface, has become the remote-repository-driven pipeline-as-code approach favored by modern DevOps practitioners. Unsurprisingly, the major cloud service providers have their own product offerings in this space, enabling pipeline adoption alongside broader modernization of services to the cloud.
As you would expect, there are huge numbers of tools to choose from when building a pipeline, falling into the following service areas:
Code Repos: A source code repository is where code originates for build, test, and deployment. Popular examples of code repositories include GitHub, GitLab, and Bitbucket.
CI Tools: Used to automate tasks related to build, test, and deployment, this family of tools includes Jenkins, Bamboo, and Travis.
Quality: Tools offering code analysis ensure code meets the mark, with test failure reporting and code quality reporting; SonarQube is an example.
Repo Management: Artifact repository management optimizes binaries used in software development, centralizing storage of versioned files for use by the CI/CD pipeline in all environments. Such tools include Nexus and JFrog.
Configuration Management: Configuration management tools ensure consistency across environments, improving the reliability of code deployments. Examples of these tools include Ansible and Puppet.
Security: CI/CD pipeline security tools enable the scanning of VMs, containers, and images for misconfiguration, vulnerabilities, and exposed secrets, using consistent policy frameworks to ensure end-to-end visibility and control. Wiz offers tools in this category.
Communication: Communication tools within the pipeline enable efficient sharing of status messages, feedback, and task-related items to streamline the process. Many of the tools listed above include their own mechanisms for providing communications updates, but Slack is a popular choice.
CI/CD: How Does It Work?
CI/CD is essential to DevOps software delivery, enabling a repeatable means to deploy software products efficiently. A typical CI/CD approach utilizes a four-stage model.
Source: A pipeline run is typically triggered by a change in a source code repository, which generates a notification in the CI/CD tooling and initiates the pipeline. The pipeline may also be triggered manually, or by other pipelines.
Build: Source code and dependencies are packaged into units appropriate to their purpose, be it compiled code ready to execute, a machine image ready to deploy or a container image ready for an orchestration service.
Test: Automated test tools validate code to ensure adherence to policy, quality standards, and expected behavior. Tests may be simple or complex, depending on the scale and complexity of the product or release. Issues identified at this stage can be rectified quickly, as the development team is engaged in the CI/CD process.
Deploy: Subject to the code passing all tests, including the scanning for security vulnerabilities that will have been running throughout the CI/CD process, it is ready for production deployment. This may happen via staging environments, and can be an automatic process of merging all approved changes from the master branch to production based on successful completion of all tests in the previous stage.
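The four stages above can be sketched as a simple script, with stub functions standing in for real tooling; the stage names and messages are illustrative only, and the test stage here always passes.

```shell
#!/bin/sh
# Minimal sketch of the source -> build -> test -> deploy flow.
# Each stub echoes what a real stage would do; the test stub
# always succeeds in this sketch.
set -u

source_stage() { echo "source: change detected in repository"; }
build_stage()  { echo "build: compiling and packaging artifact"; }
test_stage()   { echo "test: running automated checks"; }
deploy_stage() { echo "deploy: promoting to production"; }

source_stage
build_stage
if test_stage; then
  deploy_stage
  pipeline_status=deployed
else
  echo "pipeline halted: tests failed" >&2
  pipeline_status=halted
fi
```

The essential property is the gate between test and deploy: deployment only proceeds when the previous stage succeeds, which is exactly what CI/CD tooling automates at scale.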
Securing the CI/CD Pipeline
CI/CD has provided the opportunity to integrate automated software tools into every stage of the software development life cycle resulting in products that are secure by design. Shift-left has meant that security considerations are uppermost in the minds of the whole team, with security tools available to developers and feedback from analysis and testing at every lifecycle stage empowering improvements throughout.
Source code should be scanned for vulnerabilities on download or creation, ensuring a secure start to the pipeline, with a repository appropriately secured and access controlled. The product should be scanned and assessed as it progresses between CI/CD stages as well as between physical environments, with any vulnerabilities categorized and risks captured for appropriate mitigation. Maintain tight control of where secrets are kept, and use a dedicated key management service to keep access to a minimum – ensuring that they are not exposed during build and never appear in source code.
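As a simple illustration of keeping secrets out of source code, real pipelines use dedicated scanners such as gitleaks or truffleHog; the pattern-matching sketch below, which creates and checks a mock source tree, only shows the gating idea.

```shell
#!/bin/sh
# Sketch: fail a pipeline stage when obvious hard-coded secrets
# appear in source. A mock source tree stands in for the codebase;
# real scanners (e.g. gitleaks) use far more robust detection.
set -u
mkdir -p demo_src
printf 'db_password = "hunter2"\n' > demo_src/config.py   # mock offending file

if grep -rEl '(password|secret|api_key)[[:space:]]*=' demo_src; then
  echo "secret check: potential hard-coded secret - failing the stage"
  secrets_found=1
  # a real pipeline stage would exit non-zero here
else
  echo "secret check: clean"
  secrets_found=0
fi
rm -rf demo_src
```

Running such a check on every commit keeps secrets from ever landing in the repository, complementing the key management service that holds the real credentials.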
The pipeline itself, and all tools within, should also be appropriately secured using role-based access and the principle of least privilege, ensuring the team have sufficient access to perform their roles, but minimizing the opportunity for a malicious actor to undermine the CI/CD process. The pipeline provides a great deal more logging and audit data than previous labor-intensive processes, making it easier to identify any unusual patterns of activity.
To address vulnerabilities and misconfigurations in the pipeline and resolve issues pre-deployment, you need a CI/CD security product to reduce threat surface area. You’re welcome to request a Wiz demo to find out more.