Background: This study evaluated the performance of ChatGPT-4 Omni (ChatGPT-4o) in answering multiple-choice questions from the Dentistry Specialty Examination (DUS), a nationwide examination conducted in Türkiye that assesses knowledge in basic medical sciences and clinical dentistry sciences. It also examined whether performance varied with question language (Turkish vs. English).
Methods: The dataset included 1504 unique questions from publicly available DUS exams (2012-2021), categorized into Basic Medical Sciences (n = 514) and Clinical Dentistry Sciences (n = 990). Each question was presented to ChatGPT-4o in both Turkish and English, generating 3008 responses. Accuracy was determined using the official answer key. McNemar's test compared accuracy between languages, while chi-square and Bonferroni post-hoc tests assessed differences across disciplines.
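As a rough illustration of the paired-comparison workflow described above (not the study's actual analysis code), the following Python sketch shows how per-question Turkish and English correctness flags could be compared with McNemar's test and how several per-discipline p-values could be Bonferroni-adjusted. All variable names and data values are hypothetical placeholders.

```python
# Minimal sketch, assuming per-question Boolean correctness flags for the
# Turkish and English responses (illustrative data, not study results).
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar
from statsmodels.stats.multitest import multipletests

# 1 = answered correctly, 0 = answered incorrectly (hypothetical values).
turkish_correct = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1])
english_correct = np.array([1, 1, 1, 1, 0, 1, 1, 1, 1, 1])

# Build the 2x2 table of paired outcomes required by McNemar's test:
# rows = Turkish correct/incorrect, columns = English correct/incorrect.
both_correct   = int(np.sum((turkish_correct == 1) & (english_correct == 1)))
tr_only        = int(np.sum((turkish_correct == 1) & (english_correct == 0)))
en_only        = int(np.sum((turkish_correct == 0) & (english_correct == 1)))
both_incorrect = int(np.sum((turkish_correct == 0) & (english_correct == 0)))
table = [[both_correct, tr_only],
         [en_only, both_incorrect]]

# McNemar's test uses only the discordant pairs; the exact binomial form
# is appropriate when discordant counts are small.
result = mcnemar(table, exact=True)
print(f"McNemar p-value: {result.pvalue:.4f}")

# Bonferroni adjustment across several per-discipline comparisons
# (placeholder p-values, not the values reported in the Results).
per_discipline_p = [0.004, 0.039, 0.013, 0.003, 0.005]
reject, p_adjusted, _, _ = multipletests(per_discipline_p, method="bonferroni")
print("Bonferroni-adjusted p-values:", np.round(p_adjusted, 4))
```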
Results: ChatGPT-4o showed significantly higher accuracy for English questions (87.8%) than Turkish questions (84.0%) (P < .001). In Basic Medical Sciences, accuracy was significantly higher for English questions in Anatomy (P = .004) and Physiology (P = .039), while Biochemistry achieved 100% accuracy in both languages. In Clinical Dentistry Sciences, English responses were significantly more accurate in Periodontology (P = .013), Endodontics (P = .003), and Pediatric Dentistry (P = .005), whereas Turkish responses performed better in Maxillofacial Radiology (P = .013). The highest error rates were in Prosthetic Dentistry (20.1%) for English and Endodontics (18.3%) for Turkish.
Conclusion: ChatGPT-4o demonstrated high accuracy on DUS exam questions, with English responses generally outperforming Turkish ones. Performance varied across disciplines, indicating potential language-based limitations. These findings highlight the potential of large language models (LLMs) for dental education while underscoring the need for improvements in language processing and discipline-specific knowledge.
Cite this article as: Dündar Sarı MB, Sezer B. ChatGPT-4 Omni’s Accuracy in Multiple-Choice Dentistry Questions: A Multidisciplinary and Bilingual Assessment. Essent Dent. 2025, 4, 0029, doi: 10.5152/EssentDent.2025.25029.