Monday, April 13, 2026

DoctorGPT: AI’s rising role in Gainesville healthcare

Alachua County Fire Rescue, Shands Hospital and UF Health reflect on use

The UF Health Shands Emergency Room, Sunday, March 29, 2026, in Gainesville, Fla.

Read other stories from the "These stories were not AI-generated" special edition here.

Artificial intelligence has entered health care. Medical notes can now be transcribed with AI, a Google search can return an instant diagnosis and ChatGPT is taking over the role of a traditional therapist for younger generations.

Still, some in the medical field across Alachua County say AI needs more research, largely due to privacy concerns.

Alachua County Deputy Fire Chief Jeff Taylor said the department uses AI for administrative tasks but remains wary of bringing the technology into patient care.

“Currently, we use tools that measure our effectiveness and response and help us to understand the types of calls that we run,” Taylor said. 

The county is taking a “conservative approach” to AI, using it only for note-taking, he said.

Taylor also said that while AI raises concerns about patient privacy and HIPAA laws, Alachua County doesn’t enter patient information into any large language model, or LLM, that could retain or reuse that sensitive data.

Access to patient records has been a topic of debate since AI first crept its way into health care, a trend that was the subject of a roundtable with health officials hosted by Guidehouse. Because AI systems handle sensitive information, they must be HIPAA compliant.

According to The HIPAA Journal, AI doesn’t have its own safety rules. Instead, it must follow the same rules that govern protected health information for humans or more traditional documentation systems.

Other local emergency responders echoed Taylor’s concerns. Derek Hunt, a clinical educator at UF Health Shands Hospital and flight paramedic, said he hasn’t seen AI used much in the field due to its inaccuracy. However, some medical devices, such as ultrasound machines, are beginning to use AI, Hunt said.

Hunt also said patients use chatbots for their symptoms “all the time.”

“Sometimes the information is good, and sometimes it isn't, and it's really a case-by-case basis,” Hunt said. “But everyone's going to Google everything before or while you're talking to them.”


Research from Duke University published in February found chatbots tend to give advice that is medically correct but lacks context. LLMs can’t take patient histories or read between the lines with symptoms, researchers concluded.

Like Taylor and the Alachua County Fire Rescue, Hunt said UF Health has been proactive in protecting patient privacy and complying with HIPAA even as it adopts AI programs, saying Shands uses rigorous procedures to make sure it isn’t violating any laws.

At UF, AI is also slowly being integrated into coursework and patient care. The College of Public Health and Health Professions offers a certification in AI in public health and health care. UF Health has hired more than 30 faculty members who specialize in AI across six UF Health colleges.

As part of its AI research, UF Health has also begun using AI in operating rooms and to analyze patient care and complications in the ICU.

Dr. James Wesley, a primary care provider for the Student Health Care Center, said in an email to The Alligator he has patients come to him with AI health advice at least once a day.

Wesley also echoed earlier sentiments that AI isn’t a useful tool for patients seeking medical advice, saying chatbots have no safeguards and are unable to identify true crises.

Research published in the journal Nature Medicine found LLMs were able to pass medical exams but couldn’t correctly identify diseases in real-world settings. In an interview with the BBC, a co-author of the study, Dr. Rebecca Payne, explained people are using LLMs to act as a doctor even though the technology isn’t advanced enough yet.

Liyana Ahmed, an 18-year-old UF biology freshman, said she has researched how AI is affecting health care and believes there are both positive and negative implications of AI use within health professions.

“I think that it can be very, very beneficial and help doctors with diagnoses of patients in a much faster and easier way,” Ahmed said. “But I also think that it should be used with caution and only if its validity can be proved.”

Angie Joseph, an 18-year-old UF biology freshman, also believes AI can be used for good in the medical field if humans check the results. However, she believes AI shouldn’t be relied on to make patient care decisions.

"It's not a human at the end of the day,” Joseph said. “They are not going to have the same ethics that a human might have.”

Contact Nevaeh Baker Harris at nbakerharris@alligator.org. Follow her on X @nbakerharris.


Nevaeh Baker Harris

Nevaeh Baker Harris is a first-year sports and media journalism major and The Alligator's Spring 2026 Student Government reporter. In her free time, she enjoys watching medical dramas, reading horror novels, and listening to 90s rock music.


All Content © 2026 The Independent Florida Alligator and Campus Communications, Inc.