AI-powered apps and bots are rapidly entering medicine, raising questions among doctors

Artificial intelligence is shaking up industries from software and law to entertainment and education.

Photo: REUTERS/John Emerson

“When someone is distressed and unguided,” Aksoy said, an AI chatbot “just drags them into this forest of knowledge without coherent context.”

A spokesperson for OpenAI said its newest models have significantly improved how they handle health questions. ChatGPT isn’t intended as a substitute for a medical professional’s guidance, the company said.


The young Turkish patient’s encounter with AI-dispensed medical wisdom comes as many patients around the world are turning to the technology for advice. In addition to the big ask-me-anything chatbots, consumers are turning to a slew of new, AI-powered consumer medical apps.

A growing number of mobile apps available on the Apple and Google app stores claim to use AI to assist patients with their medical complaints – even though they’re not supposed to offer diagnoses.

Under U.S. Food and Drug Administration guidelines, AI-based medical apps don’t require approval if they “are intended generally for patient education, and are not intended for use in the diagnosis of disease or other conditions.” Many apps have disclaimers that they aren’t a diagnostic tool and shouldn’t be used as a substitute for a physician. Some developers seem to be stretching the limits.

An app called “Eureka Health: AI Doctor” touted itself as “Your all-in-one personal health companion.” It stated on Apple’s App Store that it was “FOR INFORMATIONAL PURPOSES ONLY” and “does not diagnose or treat disease.”

But its developer, Sam Dot Co, also promoted the app on a website, where it stated in big letters: “Become your own doctor.”

“Ask, diagnose, treat,” the site stated. “Our AI doesn’t just diagnose – it connects you to prescriptions, lab orders, and real-world care.”

Apple said that after learning about Eureka Health from Reuters, it removed the app from its App Store. Apple’s guidelines for developers state that medical apps “must clearly disclose data and methodology to support accuracy claims.”

App developer Sam Dot Co didn’t respond to a request for comment. But the website changed after Reuters inquired about it, and no longer mentions the app.

In some cases, apps have given inaccurate and potentially dangerous advice.

“AI Dermatologist: Skin Scanner” says on its website that it has more than 940,000 users and “has the same accuracy as a professional dermatologist.” Users can upload photos of moles and other skin conditions, and AI provides an “instant” risk assessment. “AI Dermatologist can save your life,” the site claims.

Its Lithuania-based developer, Acina, says the app uses “a proprietary neural network” that looks for patterns to make predictions. Acina says it was trained on dermatological images to recognize specific skin conditions.

The app claims “over 97% accuracy.” But it has drawn hundreds of one-star reviews on app stores, and many users complain it’s inaccurate.

Daniel Thiberge, a tech-support analyst in New Jersey, told Reuters that he bought the app to interpret seven pictures he snapped of a small growth on his arm. Six results showed there was a “75%-95%” risk it was cancerous, he said. He then went to a dermatologist. The doctor told him the growth didn’t look problematic in any way, and it wasn’t worth doing a biopsy.

“If it’s completely, wildly off, what is the purpose of the app?” Thiberge asked. At best it’s useless, he said. “At worst, it’s dangerous, because you may not go see a dermatologist.”

In another review on the Apple App Store, a user wrote that to test the app, she uploaded photos of a melanoma, a serious form of skin cancer, that had already been diagnosed and surgically removed. But the app rated the condition “benign,” the user wrote. She told Reuters that she fears “some people will trust it and delay doctor visits.”