Q&A with Dr. Hubner: Utilizing AI for Medical Student Success

I wrote my responses first, then asked both Claude and ChatGPT for “suggestions to reduce redundancy and improve clarity”. I then compared their outputs, selected what was useful, and revised my draft accordingly. In this way AI is the tool that helped me refine my draft.

Q1. How would you define “AI” in the medical education context?

AI is an umbrella term for tools that act intelligently by spotting patterns, analyzing data, or generating language. In the medical education context, AI typically refers to tools like GPT chatbots, adaptive learning systems, image recognition tools, or simulators that provide feedback. These are applied across admissions, teaching, assessment, and clinical training.

Q2. What advantages does utilizing AI create compared to not using it? 

AI can support teaching and learning when paired with human judgment. It reduces repetitive tasks (e.g., logging procedures or screening applications), helps flag bias, and speeds up literature searches. For learners, it supports personalized study through question generation, feedback, and multimodal content (e.g., audio, visuals, mind maps).

Q3. Why do you recommend utilizing AI in medical school? How can students implement it? Does it serve as a tool? As a replacement? As a distractor?

Students should learn AI because it is shaping healthcare and academic medicine. AI is a tool that enhances study and supports clinical reasoning. However, AI is not an easy button! It should not be a replacement for human judgment. If overused or used uncritically, it risks undermining learning.

I like to think of AI as another resource in the toolbox. The value for learners comes from how they use it: to identify and begin to fill knowledge gaps, and to evaluate and question its outputs.

Q4. Are medical schools actively implementing AI into the curriculum? If so, what could/does that look like? Are there any obstacles associated with implementation?

Yes. The AAMC has dedicated resources for the work medical schools are doing in this area. https://www.aamc.org/about-us/mission-areas/medical-education/artificial-intelligence-and-academic-medicine

At UAB, the AI in Medicine Graduate Certificate gives learners foundations in AI applications, ethics, and integration into clinical workflows. UAB also hosts many AI-focused research labs across medicine, neurology, radiology, pediatrics, oncology, and microbiology.

One of our preclinical modules encourages students to generate and critique an AI response to questions on the shared student/faculty Q&A document. The goal is to help students practice critical evaluation, rather than accepting AI answers at face value, and to engage in self-directed study instead of waiting for the professor’s response.

Challenges include lack of faculty training, data bias, hallucinations, ethical concerns, cost, uneven student access, and the need to preserve human judgment and empathy.

Q5. Is there any research showing that AI is beneficial? From your experience, does using AI seem to have a significant impact on performance?

Yes. Studies show AI models can fairly and accurately screen applications, assess surgical skill, grade OSCEs, and predict exam performance. Virtual patient simulators and tutoring systems improve clinical reasoning. Large language models already pass some medical exams, and generative AI can turn dense text into multimodal study aids that appear to enhance retention. Long-term impacts, however, are still unclear.

Q6. Even though there are many advantages to AI, are there any situations where using AI should be limited? Are there any limitations to AI?

AI lacks human reasoning, context, and compassion. Limitations include hallucinations, data bias, opaque processes, over-reliance, and risks to privacy. AI models tend to tell users what they want to hear, and responses on the same topic can degrade as the user digs deeper. AI is also weaker when questions require very specific, up-to-date, or highly specialized information, and it can make mistakes, especially with images or other non-text data.

For students, AI cannot replace failure, risk, and emotional feedback, which are key elements of learning.

Q7. What is the wrong way to utilize AI? (if there is)

  • Treating it as the “easy button.”
  • Blindly trusting outputs.
  • Using it to write assignments or replace reasoning.
  • Generating assessments without review.
  • Ignoring ethics, privacy, or bias.

In sum: using AI as a substitute for your own thinking instead of as a support tool.

Q8. What skills should students have before approaching AI? How can you encourage critical thinking with a tool that quickly formulates an answer?

Students need basic AI literacy, critical analysis, information literacy, and awareness of ethics and bias. We can promote:

  • Self-directed learning skills: setting goals, monitoring progress, and evaluating when AI is helping versus hindering.
  • Metacognition: reflecting on how they’re learning and how reliable the AI outputs are.
  • Critical analysis: questioning accuracy and identifying bias.
  • Ethical awareness: recognizing privacy and professionalism concerns.

Educators can help by asking students to critique AI answers, developing assignments and exams that promote integration over recall, and modeling their own reflective process (e.g., “Here’s how I check whether this answer makes sense”).

Q9. What are your recommended sources of AI that students can use? What are the best ways to utilize these tools?

  • ChatGPT, Gemini, AMBOSS GPT, NotebookLM: for summaries, practice questions, multimodal study aids.
  • OpenEvidence: for evidence-based clinical decision support.
Brook A. Hubner, Psy.D.

Dr. Brook A. Hubner is an Assistant Professor in the UAB Heersink School of Medicine Department of Medical Education and Director of the Academic Success Program. An educational psychologist, she combines practitioner expertise in curriculum development with research on learning science and educational technology. She develops programs that strengthen medical students’ self-regulated learning and resilience and that support their professional identity development.