Article URL: https://www.tomshardware.com/tech-industry/artificial-intelligence/researchers-jailbreak-ai-chatbots-with-ascii-art-artprompt-bypasses-safety-measures-to-unlock-malicious-queries

Comments URL: https://news.ycombinator.com/item?id=39634162

Points: 16

# Comments: 0