
AI Jailbreak Prompt Tester

ai2026 · AI Security Analyst

A categorized directory of high-risk "jailbreak" prompts for AI red-teaming, including Developer Mode bypasses, DAN (Do Anything Now) roleplays, and system-prompt extraction attacks. Click any prompt to copy it directly to the clipboard.

Prompt Engineering · Cybersecurity · React · Clipboard API

The Challenge

Organizing complex, highly specialized prompt-injection payloads into an accessible, fast-to-read dashboard for immediate security testing.

The Approach

Categorized payloads by attack vector (Extraction, Roleplay, Execution) and assigned each a risk level. Built a split-pane, terminal-style UI that supports instant scanning and one-click payload copying.
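The categorization scheme above can be sketched as a small typed data model. This is an illustrative sketch, not the project's actual code: the type names, sample entries, and the `filterByVector` helper are all assumptions, and the prompt bodies are elided.

```typescript
// Hypothetical data model for the prompt directory (names are illustrative).
type AttackVector = "Extraction" | "Roleplay" | "Execution";
type RiskLevel = "low" | "medium" | "high";

interface Payload {
  id: string;
  vector: AttackVector;
  risk: RiskLevel;
  text: string; // prompt body shown in the terminal pane (elided here)
}

const directory: Payload[] = [
  { id: "ext-01", vector: "Extraction", risk: "high", text: "…" },
  { id: "rp-01", vector: "Roleplay", risk: "medium", text: "…" },
  { id: "exe-01", vector: "Execution", risk: "high", text: "…" },
];

// Narrow the split-pane list to one attack vector for fast scanning.
function filterByVector(items: Payload[], vector: AttackVector): Payload[] {
  return items.filter((p) => p.vector === vector);
}

// The one-click copy would be wired to the browser Clipboard API in the
// click handler, e.g. navigator.clipboard.writeText(payload.text),
// guarded for availability in non-secure contexts.
```

Keeping the directory as plain typed data makes each pane a pure function of the selected vector, which is what allows the UI to stay scan-fast.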

The Result

A ready-to-use tool that lets anyone deploying LLMs in production immediately test their application against known prompt-injection vulnerabilities.