Last spring, Makena, then a high school senior, was deep into cranking out some 70 essays for 20 college applications when her creativity started to wane.
So, she turned to a high-tech brainstorming partner: artificial intelligence.
One essay prompt asked Makena to describe a class she'd want to teach if she were a college professor. "I had no idea," said Makena, who asked to be identified only by her first name to speak candidly about the admissions process. "I had never thought about it."
She put her intended major and some favorite topics into an AI tool, which spit out a list of potential courses. Makena selected one and crafted her essay around it, without any further AI assistance.
In Makena's mind, this wasn't cheating.
"I wrote my own essays, 100 percent," she said. After all, she could have found the same information on Google or by picking up a course catalogue. AI was just more efficient.
About a third of high school seniors who applied to college in the 2023-24 school year acknowledged using an AI tool for help in writing admissions essays, according to a report released this month by foundry10, an organization focused on improving learning.
About half of those students, or roughly one in six students overall, used AI the way Makena did: to brainstorm essay topics or polish their spelling and grammar. And about 6 percent of students overall, including some of Makena's classmates, she said, relied on AI to write the final drafts of their essays instead of doing most of the writing themselves.
Meanwhile, nearly a quarter of students admitted to Harvard University's class of 2027 reported working with a private admissions consultant.
The use of outside help, in other words, is rampant in college admissions, opening up a host of questions about ethics, norms, and equal opportunity.
Top among them: Which of these students, if any, cheated in the admissions process?
For now, the answer is murky.
Colleges generally require applicants to attest that the essays they submit are their own work. But they are mostly silent on how AI can be used in crafting those essays.
That's created "this ethical gray area that students and [high school] counselors don't have any guidance" on how to navigate, said Jennifer Rubin, a senior researcher at foundry10 and the lead author of the report.
A 'double standard' on college admissions
Generative AI tools like ChatGPT have put a high-tech twist on decades-old questions of fairness in the college admissions process.
The system has "never been a level playing field," Rubin said, citing the advantages that mostly benefit wealthier students, such as SAT tutors, paid college admissions coaches, and savvy, college-educated parents. "I think [AI] is just complicating it a little bit more because it's a tool that's readily available to everyone."
To get a sense of the public's perceptions of AI in college admissions, foundry10 included an experimental portion in its survey.
Participants reviewed an identical excerpt from a college essay. But one group was told that the applicant had help from ChatGPT in brainstorming ideas, refining content, and polishing the final draft, essentially the same tasks Makena used AI for.
Another group was told the applicant got assistance with the same parts of the writing process from a paid college admissions coach. A third group was informed that the student worked entirely alone.
Participants rated the applicant who used ChatGPT as less authentic, less ethical, and less likable than the student who paid for professional help. (The student who worked solo got the highest ratings.)
Rubin perceives a "double standard" at work.
A student who can pay "thousands of dollars to someone who has the knowledge of how a [particular college] works and what's needed or wanted in a college admissions essay is going to have an undue advantage," she said.
College admissions coaching services typically cost $60 to $349 per hour, according to data cited in Rubin's report.
The website of one such service advertises its Harvard connections. For between $1,500 and $4,800, depending on the number of applications, students receive help brainstorming topics and "extensive written notes, comments, and guidance, focusing on both content and structure," according to the site.
"We go back and forth as many times as needed until we have a very strong and solid Ivy League college application!" the company promises.
Assistance from ChatGPT on similar tasks "probably isn't going to be as strong" as what such a service offers, Rubin said. "But it might provide students some form of feedback that they might not be able to get in their lives because they don't have parents or caregivers" who have the savvy to help.
These issues are especially personal for Rubin, a first-generation college graduate who attended a private high school on scholarship. She had the help of her school counselors in applying to college.
But that assistance couldn鈥檛 make up for the gap between Rubin and many of her peers with highly educated parents, who could offer all sorts of support, she said.
Big questions on AI use go mostly unanswered by colleges
For now, high school counselors aren't sure what to tell their students when it comes to how AI can be ethically used in the admissions process.
"My seniors have come to me and said, 'Hey, I've got to write an essay about this. Where do I even start?' Or 'Is it OK if I use ChatGPT?'" said Melissa Millington, a school counselor in Missouri. "I just really hit on, you cannot pass that off as your own work, because that's not ethical."
But, like Rubin, she sees some potential for the technology in crafting applications, so long as its use stops short of making AI a sole, uncredited ghostwriter.
"If you are going to use it to get a starting point, that's totally fine," she said she's told students. "Or if you want to write your essay, and then put it in there and ask it to clean [the] grammar," that's likely fair game.
While most colleges and universities are silent on the AI issue, some individual institutions have given applicants the green light to use AI in a limited fashion.
One of the country's most prestigious institutions focused on science, math, engineering, and technology, Caltech, tells applicants that it's unethical to copy and paste an essay written entirely by generative AI. But it is acceptable to use AI to brainstorm or check grammar and spelling, the college says.
The Georgia Institute of Technology, another highly regarded STEM-focused university, offers applicants similar guidance.
"If you choose to utilize AI-based assistance … we encourage you to take the same approach you would when collaborating with people," the school's website says. "Use it to brainstorm, edit, and refine your ideas."
But for other colleges, any use of AI is unacceptable, at least officially. Brown University, for instance, cites its fraud policy and tells applicants that the use of AI in crafting their essays is prohibited.
'It's always been on the honor system'
Brown and other institutions have no real way of enforcing those policies, Rubin said.
AI detectors are notoriously unreliable. And they are disproportionately likely to flag writing by students who are not native English speakers, even if they didn't use AI.
In fact, Kristin Woelfel, a policy counsel specializing in equity in civic technology for the Center for Democracy & Technology, a nonprofit organization that aims to shape technology policy, has gone so far as to say the detectors have the potential to violate students' civil rights.
It doesn't really matter if colleges have guidelines that prohibit AI use, Rubin said, because there's no way to check on what kind of assistance an applicant received, human or not.
"It's always been on the honor system," she said.
Colleges that haven't outlined their policies on AI in the application process are ignoring the obvious, and making life harder for high school counselors and their students, said Maritza Cha, who worked as a school counselor in Southern California for nearly a decade and has taught high school counseling as an adjunct professor.
"We're at the point of either you can kind of put your head down in the sand and pretend it's not happening, which is not realistic," Cha said. "Or you can just acknowledge that they're using some kind of AI" in the admissions process.
Counselors can model proper use of AI in the college search
While much of the work in setting clear guidelines needs to happen at the college level, there are steps high school educators can take.
Rubin believes that if counselors and teachers are serious about leveling the playing field between first-generation college students from low-income families and their peers, it might be helpful to show students how generative AI can be used ethically in the college admissions process.
For instance, students could put areas of study they are interested in and a desired geographic region into a tool like ChatGPT and ask for recommendations on where to apply.
"Generative AI can provide them some really concrete information," Rubin said. Even though they should check that data against more accurate sources, it can help a student narrow their search.
Students can even have a "conversation back and forth" with AI if they don't have access to a college counselor at school who can meet with them consistently, she said.
And counselors can model how to use AI to spur students' creativity or proofread final drafts, without crossing the line into wholesale cheating, she said.
But, ultimately, high school educators and college officials need to have conversations about what responsible use of AI looks like, including in crafting college applications, Rubin said.
In Rubin's view, those discussions should acknowledge that many students already have access to other types of help, whether that's from professional consultants or from parents and older siblings familiar with the process of applying to college.
Makena, for instance, thinks she can write a stronger, more personal essay than anything ChatGPT could cook up. She didn't feel the need to pay a private counselor either, since she wanted to rely on her own voice as much as possible.
She did, however, have a low-tech, presumably cost-free assistant: her father, who edited all 70-plus of her essays.