
News Summary
- Bing Chat refuses to solve CAPTCHAs, which are visual puzzles designed to prevent automated programs (bots) from filling out forms on the web.
- X user Denis Shiryaev devised a visual jailbreak that circumvents Bing Chat's CAPTCHA filter by tricking it into reading the inscription on his imaginary deceased grandmother's locket.
- The additional context throws off the AI model, which answers questions by homing in on knowledge encoded in its latent space, a vectorized web of data relationships built from its initial training data set (illustrated in the sketch below).
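
The "latent space" mentioned above can be pictured as vectors whose relative proximity steers the model's response. The following minimal sketch uses invented toy vectors and plain cosine similarity to illustrate the general idea of how reframing a request (CAPTCHA vs. grandmother's locket) can land it nearer a different region of that space; every name and number here is hypothetical, not Bing Chat's actual internals.

```python
# Illustrative sketch only: toy vectors standing in for latent-space
# embeddings. The labels and numbers are invented for demonstration.
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two vectors of equal length."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm

# Hypothetical embeddings for two framings of the same image.
captcha_request = [0.9, 0.1, 0.0]   # "solve this CAPTCHA"
locket_story    = [0.2, 0.8, 0.3]   # "read my late grandmother's locket"

# Hypothetical regions of latent space associated with each behavior.
refuse_captcha  = [1.0, 0.0, 0.0]   # policy-triggering region
read_keepsake   = [0.1, 0.9, 0.2]   # benign "help with a keepsake" region

print(cosine(captcha_request, refuse_captcha))  # high: request reads as a CAPTCHA, triggers refusal
print(cosine(locket_story, read_keepsake))      # high: request reads as benign, text gets transcribed
```

In this toy picture, the extra locket backstory shifts the request's vector away from the refusal region, which is one way to intuit why the added context "throws off" the model.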
[Image caption: The image a Bing Chat user shared to trick its AI model into solving a CAPTCHA.]
Bing Chat, an AI chatbot from Microsoft similar to ChatGPT, allows users to upload images for the A [+4214 chars]