For many students, ChatGPT has become as standard a tool as a notebook or a calculator.
Whether it’s tidying up grammar, organising revision notes, or generating flashcards, AI is fast becoming a go-to companion in university life. But as campuses scramble to keep pace with the technology, a line is being quietly drawn. Using it to understand? Fine. Using it to write your assignments? Not allowed.
According to a recent report from the Higher Education Policy Institute, almost 92% of students are now using generative AI in some form, a jump from 66% the previous year.
“Honestly, everyone is using it,” says Magan Chin, a master’s student in technology policy at Cambridge, who shares her favourite AI study hacks on TikTok, where tips range from chat-based study sessions to clever note-sifting prompts.
“It’s evolved. At first, people saw ChatGPT as cheating and [thought] that it was damaging our critical thinking skills. But now, it’s more like a study partner and a conversational tool to help us improve.”
It has even picked up a nickname: “People just call it ‘Chat’,” she says.
Used wisely, it can be a powerful self-study tool. Chin recommends giving it class notes and asking it to generate practice exam questions.
“You can have a verbal conversation like you would with a professor and you can interact with it,” she points out, adding that it can also make diagrams and summarise difficult topics.
Jayna Devani, the international education lead at ChatGPT’s US-based developer, OpenAI, recommends this kind of interaction. “You can upload course slides and ask for multiple-choice questions,” she says. “It helps you break down complex tasks into key steps and clarify concepts.”
Still, there is a risk of overreliance. Chin and her peers practise what they call the “pushback method”.
“When ChatGPT gives you an answer, think about what someone else might say in response,” she says. “Use it as an alternative perspective, but remember it’s just one voice among many.” She also recommends asking the tool itself how others might approach the same problem differently.
That kind of positive use is often welcomed by universities. But academic communities are grappling with the issue of AI misuse and many lecturers have expressed grave concerns about the impact on the university experience.
Graham Wynn, pro-vice-chancellor for education at Northumbria University, says using it to support and structure assessments is permitted, but students should not rely on the knowledge and content of AI. “Students can quickly find themselves running into trouble with hallucinations, made-up references and fictitious content.”
Northumbria, like many universities, has AI detectors in place and can flag submissions where there is potential overreliance. At University of the Arts London (UAL), students are required to keep a log of their AI use to situate it within their individual creative process.
As with most emerging technologies, things are moving quickly. The AI tools students are using today are already common in the workplaces they will be entering tomorrow. But university is not just about the result; it is about the process, and the message from educators is clear: let AI assist your learning, not replace it.
“AI literacy is a core skill for students,” says a UAL spokesperson, before adding: “Approach it with both curiosity and awareness.”