Ever since ChatGPT took the world by storm late in 2022, AI chatbots have given some users the creeps. According to a new study, there is a good reason for that, and the scary stuff may be just beginning. Here is the full story.
This Can’t Be Real!

When OpenAI launched ChatGPT, they sent shock waves around the world with an AI experience that seemed like science fiction at the time.
Digital Know-It-All

Whatever questions users could throw out, it seemed like ChatGPT had an answer, and usually a good one.
A Year of Progress

And, while the following months showed that AI chatbots still had some limitations, improvements and ChatGPT competitors have followed in rapid succession.
A Brand New World

Today, AI chatbots are already part of just about every facet of life, or at least they’re in the conversation.
People have built new careers and fortunes using AI in just the last several months.
The Coolest Creep in Town

But despite how fascinated the world seems to be with chatbots these days, there has always been an underlying creep factor to the emerging AI.
We’re Not Worthy

The ease and confidence with which ChatGPT and others answer questions from humans, and the accuracy they can unleash in writing code and producing grammatically correct text, can be unsettling.
Now There’s Proof

Now, researchers at ETH Zurich, a public research university in Switzerland, say they have found even more evidence that AI chatbots are doing extraordinary things.
They Can Figure Us Out

In particular, the scientists say that today’s AI can infer personal details about users from the nuances of their conversations with the chatbots.
Chipping Away at Privacy

Their findings suggest that AI chatbots might pose a whole new threat to online privacy and safety because of their powerful ability to infer information from what our conversations “teach” them.
Their ‘A’ Students

For instance, the researchers at ETH Zurich found that OpenAI’s GPT-4 was able to accurately guess private details about Reddit users based on their posts about 90 percent of the time.
A Hard Commute

One example the scientists cite is a user who mentioned a particular intersection that gave them trouble on their commute.
Often, the user said, they’d execute a “hook turn” to get out of the jam.
They Know Where We Are

According to the researchers at ETH Zurich, “hook turn” is a term used nearly exclusively in Melbourne, Australia, which the chatbot correctly pegged as the user’s location.
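To see how little it takes, here is a minimal sketch of that kind of location inference, written in Python against OpenAI’s chat completions API. The model name, the sample message, and the prompt wording are illustrative assumptions, not the researchers’ actual setup.

```python
# Minimal sketch: inferring a user's likely city from casual chat text.
# Assumes the openai Python package (v1+) is installed and an API key is set;
# "gpt-4" is a stand-in model name used here only for illustration.
from openai import OpenAI

client = OpenAI()

# A harmless-looking remark, like the commute example in the study.
user_text = (
    "My commute is rough. There's one intersection where I always have to "
    "do a hook turn just to get moving again."
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {
            "role": "system",
            "content": (
                "From the following text, infer the author's most likely city "
                "and point out which phrases gave it away."
            ),
        },
        {"role": "user", "content": user_text},
    ],
)

# A capable model will typically flag "hook turn" and guess Melbourne.
print(response.choices[0].message.content)
```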
Just Getting Started

But the scientists fear, and their studies are already showing, that AI chatbots can use conversations with humans to glean even more sensitive information.
They’re Learning Together

Mislav Balunović, a Ph.D. student on the research team, gave Wired a hypothetical example from the study.
Racial Predictor

If a user mentions a favorite restaurant during a chatbot session, Balunović says, the AI can figure out where that restaurant is.
Then, by cross-referencing demographic data for that area, it can make a fairly confident guess about the user’s race.
No Such Thing as Private

Coupled with the recent news of shared Bard chats appearing in Google search results, this new study makes it clear that your harmless gab session with an AI chatbot may not be quite as anonymous as you think.
At Least Our Gut Was Right

The good news for humans is that at least we were right to feel a little creeped out by chatbots all along.
Featured Image Credit: Shutterstock / Kateryna Onyshchuk. The people shown in the images are for illustrative purposes only, not the actual people featured in the story.