Cyber Defense Advisors

Jailbreaking LLM-Controlled Robots

Surprising no one, it’s easy to trick an LLM-controlled robot into ignoring its safety instructions.
