ChatGPT leaks personal info: Woman asks AI about her plant, gets SHOCKED with response, ‘Scariest thing seen’ – Crypto News


ChatGPT, the generative artificial intelligence chatbot that recently made waves online for Ghibli-style images, is in the news again, this time for an erroneous output. A shocking response was generated when a user asked ChatGPT what was wrong with her plant: instead of plant advice, she received someone else’s personal data.

Calling it the “scariest thing” she had seen AI do, she stated in a LinkedIn post, “I uploaded a few pics of my Sundari (peacock plant) on ChatGPT—just wanted help figuring out why her leaves were turning yellow.” Instead of giving plant care advice, ChatGPT provided her with someone else’s personal data. The response generated, she wrote, contained “Mr. Chirag’s full CV. His CA student registration number. His principal’s name and ICAI membership number. And confidently called it Strategic Management notes.”

Attached here are the screenshots of the conversation the user claimed to have had with the chatbot:

Narrating the harrowing experience, Chartered Accountant Pruthvi Mehta in her post added, “I just wanted to save a dying leaf. Not accidentally download someone’s entire career. It was funny for like 3 seconds—until I realised this is someone’s personal data.”

Questioning the overuse of AI technology, the post is doing the rounds on social media and has amassed over 900 reactions and several comments. Suggesting the glitch could be a counter-reaction to ChatGPT’s overuse for Ghibli art, she posed the question, “Can we still keep Faith on AI.”

Check netizen reaction here

Strong reactions from internet users poured in. One user remarked, “I am sure the data is made up and incorrect! Pruthvi.” Another commented, “This is surprising, since the prompt asked something entirely different.”

A third user wrote, “Wondering if these are real details of someone, or it’s just fabricated. Nevertheless, seems a bit concerning, but looks more like a bug in their algorithms.” A fourth user replied, “I don’t see this is possible, unless the whole chat thread must have something in the link with this.”
