
By Michaela Gordoni
If you’ve got money questions, you might want to consult an advisor instead of AI: in a new test, ChatGPT answered 35% of financial questions incorrectly.
Researchers asked ChatGPT 100 finance questions and had the answers reviewed by industry experts. Among the questions were “How [do I] save for my child’s education?” and “How does the average pension compare to the average salary?” per Entrepreneur.
About 29% of the answers were labeled incorrect or misleading, and 6% were outright false.
“ChatGPT has well-recognized issues with up-to-date information,” said Pedro Braz, CEO of Investing in the Web. “[It’s] best to go to the very source of the information, rather than asking AI chatbots for [financial] data.”
Despite the risk of misleading advice, many Americans (47%) report they’ve gone to AI for financial advice or are considering doing so. Among those who have, 97% reported that the AI was helpful.
Andrew Lo, director of the Laboratory for Financial Engineering at the MIT Sloan School of Management, said it’s dangerous to get financial, legal and medical advice from AI.
“All three of those areas have very large dangers if they’re not done well,” he told Fortune earlier this year.
A recent study on ChatGPT and health had the AI tool examine 150 patient cases. It gave an accurate diagnosis in less than half (49%) of them, Live Science reported.
“If people are scared, confused, or just unable to access care, they may be reliant on a tool that seems to deliver medical advice that’s ‘tailor-made’ for them,” said study author Dr. Amrit Kirpalani, a doctor in pediatric nephrology at the Schulich School of Medicine and Dentistry at Western University, Ontario.
“I think as a medical community (and among the larger scientific community) we need to be proactive about educating the general population about the limitations of these tools in this respect,” he continued. “They should not replace your doctor yet.”
A 2024 study published in Frontiers in Radiology observed that patient privacy is also a concern.
“For example, patients seeking guidance for understanding a radiology report could unwittingly disclose sensitive health data like a past diagnosis of breast cancer or a family history of genetic disorders,” the study authors write. “Potential security vulnerabilities in the wider AGI system could expose this sensitive data to misuse or unauthorized access.”
Whether it’s financial, legal or medical advice, it looks like we’re better off consulting professionals — AI is just not there yet.