ChatGPT helped a mother determine what was causing her son’s debilitating pain after 17 doctors had failed to diagnose it over three years. Here are the details of this remarkable story.
Symptoms Started at 4 Years Old

Courtney, who chose not to reveal her last name, told Today that her son Alex began showing symptoms at age four, during the COVID-19 lockdown.
Gigantic Meltdowns

The family’s nanny “started telling me, ‘I have to give him Motrin every day, or he has these gigantic meltdowns,’” Courtney said. The painkillers helped to subdue her child’s pain, but other worrisome symptoms started popping up.
Chewing on Things

Alex began chewing on objects, which caused his family to wonder if he had a cavity. A dentist examined him and didn’t find anything wrong; Alex was referred to an orthodontist who found his palate too small.
Trouble Sleeping

This can cause trouble sleeping, and Alex’s family thought this might be part of why he hadn’t been feeling well.
The orthodontist treated Alex by placing an expander in his palate, temporarily putting his family at ease.
“Everything was better for a little bit. We thought we were in the home stretch,” Courtney said.
He Stopped Growing

But Alex continued to suffer: Courtney soon noticed her son had stopped growing.
Bringing His Left Foot Along for the Ride

She also noticed he wasn’t walking as he should have been. “He would lead with his right foot and just bring his left foot along for the ride,” she said.
Severe Headaches and Exhaustion

He was also experiencing severe headaches and exhaustion.
Saw 17 Doctors in Total

Courtney’s family saw many experts to figure out what was wrong with Alex, including a pediatrician, a neurologist, an ear, nose, and throat (ENT) specialist, and more.
Courtney said they consulted 17 doctors but were left frustrated and without answers. None of the recommended treatments solved the problem.
ChatGPT to the Rescue

After three years of doctors’ appointments, Courtney turned to ChatGPT for answers. The chatbot, released by OpenAI in 2022, was designed to converse with people in natural language.
Line By Line

“I went line by line of everything that was in his MRI notes and plugged it into ChatGPT,” Courtney said. “I put the note in there about how he wouldn’t sit crisscrossed… To me, that was a huge trigger that a structural thing could be wrong.”
A Miracle Diagnosis

ChatGPT led Courtney to discover tethered cord syndrome, which is a complication of spina bifida. She made an appointment with a new neurosurgeon who confirmed that Alex had a tethered spinal cord due to spina bifida occulta.
This birth defect interferes with the development of the spine and spinal cord.
Hidden Spina Bifida

This is the mildest form of spina bifida, per the Centers for Disease Control and Prevention (CDC), which states the condition is sometimes called “hidden” spina bifida and is often not discovered until later in a child’s life.
Every Emotion in the Book

The new doctors “said point blank, ‘Here’s [occulta] spina bifida, and here’s where the spine is tethered,’” Courtney recalled. She said she felt “every emotion in the book: relief, validation, excitement for his future.”
Social Media – Doctors Think They Are Gods

“ChatGPT reveals the lack of diagnostic skills in doctors who frequently think of themselves as Gods,” one user posted.
Incredible

A second simply declared, “Wow! Incredible.”
Doctors Will Have to Consult with AI

“Insane. I have a feeling this type of thing will start happening more and more often. Soon, doctors will probably be required to consult with AI,” was another reader’s opinion.
Can’t Afford Healthcare

A unique viewpoint came from one user on X, who said, “People who can’t afford health care and those in poorer countries will greatly benefit from this.”
Human Doctors Suck

“Personally I look forward to AI Drs. The human ones suck now,” one frustrated reader wrote.
ChatGPT Not There Yet

Experts agree that ChatGPT could help certain people navigate the healthcare system, but it’s not there yet.
As Jesse M. Ehrenfeld, MD, MPH, president of the American Medical Association (AMA) said in a statement to Today, “While AI products show tremendous promise in helping alleviate physician administrative burdens and may ultimately be successfully utilized in direct patient care, OpenAI’s ChatGPT and other generative AI products currently have known issues and are not error free.”
He explained that AI can produce fabrications, inaccuracies, or errors that “can harm patients.”
Be Your Kid’s Advocate

For her part, Courtney said the technology helped her family provide the best possible care for her son. She told Today: “There’s nobody that connects the dots for you. You have to be your kid’s advocate.”
The post An AI Chatbot SOLVED This Mother’s Medical Mystery, Even After 17 Doctors Were Stumped – What’s Next for the Smart Machine Revolution? first appeared on Career Step Up
Featured Image Credit: Shutterstock / Tada Images. The people shown in the images are for illustrative purposes only, not the actual people featured in the story.
Source: Today