Estimating the Prevalence of Generative AI Use in Medical School Application Essays

Abstract

Background: Generative artificial intelligence (AI) tools became widely available to the public in November 2022. The extent to which aspiring medical school applicants use these tools during the admissions process is unknown.

Methods: We retrospectively analyzed 6,000 essays submitted to a U.S. medical school in 2021-2022 (baseline, before wide availability of generative AI) and in 2023-2024 (test year) to estimate the prevalence of AI use and its relation to other application data. We used GPTZero, a commercially available detection tool, to generate a metric for the likelihood that each essay was human-generated, P_human, ranging from 0 (entirely AI-generated) to 1 (entirely human-generated).

Results: Fully human-generated negative controls had a median P_human of 0.93, while AI-generated positive controls had a median P_human of 0.01. Personal Comments essays submitted in the 2023-2024 cycle had a median P_human of 0.77 (95% confidence interval [CI] 0.76-0.78), versus 0.83 (95% CI 0.82-0.85) in the 2021-2022 cycle. Approximately 12.3% of essays in the test year and 2.7% in the baseline year were scored as P_human < 0.5. Secondary essays had lower P_human than Personal Comments essays, suggesting greater AI use. In multivariate analysis, younger age, visa requirement, and higher GPA were significantly associated with lower P_human; no differences were observed by gender, MCAT score, undergraduate major, or socioeconomic status. P_human was not predictive of admissions outcomes in univariate or multivariate analyses.

Conclusions: An AI detection algorithm estimated significantly increased use of generative AI in 2023-2024 medical school admission applications compared with the 2021-2022 baseline. Estimated AI use was not associated with differences in admissions decisions. While these results provide information about the applicant pool as a whole, AI detection is imperfect; we recommend exercising caution before deploying any AI detection tool on individual applications in live admissions cycles.
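The manuscript abstract does not include analysis code. As a minimal illustrative sketch only, and not the authors' pipeline, the Python below shows how per-essay detection scores could be summarized by application cycle: median P_human, a percentile-bootstrap 95% CI (an assumed method for the interval, not necessarily the one used in the study), and the share of essays with P_human < 0.5. The input file name essay_scores.csv and its columns cycle and p_human are hypothetical.

```python
"""Illustrative summary of per-essay P_human scores by application cycle.

Assumes a hypothetical CSV export with columns:
  cycle    -- e.g., "2021-2022" or "2023-2024"
  p_human  -- detector score in [0, 1], 0 = entirely AI, 1 = entirely human
"""
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)


def bootstrap_ci_median(values, n_boot=10_000, alpha=0.05):
    """Percentile bootstrap CI for the median of a 1-D array of scores."""
    values = np.asarray(values, dtype=float)
    medians = np.array([
        np.median(rng.choice(values, size=values.size, replace=True))
        for _ in range(n_boot)
    ])
    lo, hi = np.quantile(medians, [alpha / 2, 1 - alpha / 2])
    return np.median(values), lo, hi


def summarize(scores: pd.DataFrame, threshold: float = 0.5) -> pd.DataFrame:
    """Per-cycle median P_human with 95% CI and percent of essays below threshold."""
    rows = []
    for cycle, grp in scores.groupby("cycle"):
        med, lo, hi = bootstrap_ci_median(grp["p_human"].to_numpy())
        rows.append({
            "cycle": cycle,
            "median_p_human": round(med, 2),
            "ci_95": (round(lo, 2), round(hi, 2)),
            "pct_below_threshold": round(100 * (grp["p_human"] < threshold).mean(), 1),
        })
    return pd.DataFrame(rows)


if __name__ == "__main__":
    scores = pd.read_csv("essay_scores.csv")  # hypothetical export of per-essay scores
    print(summarize(scores))
```

Under these assumptions, the output is one row per cycle, which is the form of comparison reported in the Results (e.g., median 0.83 in 2021-2022 versus 0.77 in 2023-2024, and the fraction of essays below 0.5).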

Competing Interest Statement

The authors have declared no competing interest.

Funding Statement

The study was funded using the senior author's discretionary departmental funds.

Author Declarations

I confirm all relevant ethical guidelines have been followed, and any necessary IRB and/or ethics committee approvals have been obtained.

Yes

The details of the IRB/oversight body that provided approval or exemption for the research described are given below:

The Human Research Protection Office at Washington University School of Medicine determined that this study does not represent human subjects research (202401024). Moreover, permission was obtained from the Association of American Medical Colleges to use AMCAS data for this study.

I confirm that all necessary patient/participant consent has been obtained and the appropriate institutional forms have been archived, and that any patient/participant/sample identifiers included were not known to anyone (e.g., hospital staff, patients or participants themselves) outside the research group so cannot be used to identify individuals.

Yes

I understand that all clinical trials and any other prospective interventional studies must be registered with an ICMJE-approved registry, such as ClinicalTrials.gov. I confirm that any such study reported in the manuscript has been registered and the trial registration ID is provided (note: if posting a prospective study registered retrospectively, please provide a statement in the trial ID field explaining why the study was not registered in advance).

Yes

I have followed all appropriate research reporting guidelines, such as any relevant EQUATOR Network research reporting checklist(s) and other pertinent material, if applicable.

Yes
