Jailbreaking LLM-Controlled Robots
December 11, 2024
Surprising no one, it’s easy to trick an LLM-controlled robot into ignoring its safety instructions.