Researchers explain that it is surprisingly easy to redirect LLM-equipped robots, including military and security robots, in dangerous ways.
Link information
This data is scraped automatically and may be incorrect.
- Title: It's Surprisingly Easy to Jailbreak LLM-Driven Robots
- Published: Nov 11, 2024
- Word count: 1,003 words