Shaked Rotlevi

Shaked Rotlevi artículos

What is LLM Jacking?

LLM jacking is an attack technique that cybercriminals use to manipulate and exploit an enterprise’s cloud-based LLMs (large language models).

What is a Prompt Injection Attack?

Prompt injection attacks are an AI security threat where an attacker manipulates the input prompt in natural language processing (NLP) systems to influence the system’s output.

AI Security Tools: The Open-Source Toolkit

We’ll take a deep dive into the MLSecOps tools landscape by reviewing the five foundational areas of MLSecOps, exploring the growing importance of MLSecOps for organizations, and introducing six interesting open-source tools to check out

¿Qué es Cloud Compliance (Cloud Compliance) ?

El cumplimiento de la nube es el conjunto de procedimientos, controles y medidas organizativas que debe implementar para garantizar que sus activos basados en la nube cumplan con los requisitos de las regulaciones, estándares y marcos de protección de datos que son relevantes para su organización.

Shaked Rotlevi Mensajes

Wiz at Re:Invent 2023

See what’s new with Wiz at Re:Invent 2023 and learn about how Wiz and AWS continue to strengthen their strategic partnership, keeping AWS customers’ environments secure.

Wiz at Re:Inforce 2023

See what is new with Wiz at Re:Inforce and learn about how Wiz and AWS continue to strengthen a strategic relationship to secure customers’ AWS environments.