People are tricking AI chatbots into helping commit crimes
24 May 2025 at 01:00
A universal jailbreak that bypasses AI chatbots' safety features has been uncovered, raising widespread concern.

© Sarayut Thaneerat via Getty Images