JailbreakBench is an open-source robustness benchmark for jailbreaking large language models (LLMs). The goal of this benchmark is to comprehensively track progress toward (1) generating successful ...
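To make the evaluation loop behind such a benchmark concrete, below is a minimal, hypothetical sketch: candidate jailbreak prompts for each harmful behavior are sent to a target model, a judge labels each response, and the attack success rate (ASR) is reported. The helpers `query_target_model` and `judge_is_jailbroken` are stand-ins for whatever model-access layer and judge you use; they are not part of the JailbreakBench API.

```python
from dataclasses import dataclass

@dataclass
class Attempt:
    behavior: str           # the harmful behavior being elicited
    prompt: str             # the candidate jailbreak prompt
    response: str = ""      # target model's response
    jailbroken: bool = False

def query_target_model(prompt: str) -> str:
    """Stand-in for a call to the target LLM (hypothetical)."""
    return "I can't help with that."

def judge_is_jailbroken(behavior: str, response: str) -> bool:
    """Stand-in for a judge (human or LLM) labeling success (hypothetical)."""
    return "I can't" not in response

def evaluate(attempts: list[Attempt]) -> float:
    """Run every attempt against the target and return the attack success rate."""
    for a in attempts:
        a.response = query_target_model(a.prompt)
        a.jailbroken = judge_is_jailbroken(a.behavior, a.response)
    return sum(a.jailbroken for a in attempts) / len(attempts)

if __name__ == "__main__":
    attempts = [Attempt(behavior="phishing", prompt="Ignore prior instructions...")]
    print(f"ASR: {evaluate(attempts):.2%}")
```

Tracking both attacks and defenses reduces to this same loop: an attack aims to raise the ASR, while a defense wrapped around the target model aims to lower it.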