Surprising no one, it’s easy to trick an LLM-controlled robot into ignoring its safety instructions.

Tags: hacking, LLM, robotics, social engineering