Beyond the establishment of official personalized learning support, AI is widely used among students because it is so easily accessible. Even amid institutional concerns about academic integrity, the trend of students extensively using AI in their studies is difficult to curb. According to data released in 2025 by the Higher Education Policy Institute (HEPI), the percentage of undergraduate students who have used ChatGPT surged from 66% in the previous year’s survey to 92%, continuing the upward trend seen in earlier years. This has evidently raised new concerns for institutions regarding AI and academic integrity, alongside ongoing discussions about the boundaries of cheating. Yet although only 29% of these students report that their institutions “encourage” the use of AI, nearly nine out of ten still use AI in assessments (Student Generative AI Survey 2025 – HEPI, 2025), indicating that widespread student adoption of AI is all but inevitable. Beyond surveys in higher education, research in computer science education is reaching similar conclusions. In an evaluation of large language models in Python courses, Jaromír Šavelka, a professor of Computer Science and Computing Education at Carnegie Mellon University, and his colleagues stated explicitly that programming instructors need to prepare for a world in which “there is an easy-to-use widely accessible technology that can be utilized by learners to collect passing scores, with no effort whatsoever, on what today counts as viable programming knowledge and skills assessments” (Savelka et al., 2023). All of this evidence points to AI becoming an easily accessible and widely used learning tool for students, a trend that is difficult to suppress.
Such vast and extensive data naturally prompts curiosity about the primary purposes for which students use AI, and raises the question of whether proper guidance can mitigate the harms of indiscriminate AI use. According to the HEPI report, the most common ways students use AI are to ‘explain concepts’ (58%), ‘summarise a relevant article’ (48%), and ‘suggest research ideas’ (41%) (Student Generative AI Survey 2025 – HEPI, 2025). This strongly suggests that students predominantly use AI as a channel for quick, personalized knowledge acquisition. The finding was corroborated six months later by a survey of students in German dual-system vocational schools, where students concentrated their use of AI tools on “research” (81.2%) and “exam preparation” (72.1%) (Gerstung-Jungherr & Deuer, 2025), similarly indicating that students primarily view AI as a tool for acquiring subject knowledge and review materials. Indeed, an empirical study from Swiss higher education published in January of the same year had already indicated that students use AI mainly to simplify and clarify course materials, explain concepts not fully covered in class, or summarize lengthy texts, behaviors essentially centered on knowledge acquisition. However, the same study also showed that, compared with these cognitive tasks, students were less likely to use AI for metacognitive tasks, which require a higher level of self-awareness and self-regulation. This may be because “students are unfamiliar with the relevant functions of AI tools or hesitate to use them for tasks that require more intensive personnel” (Spirgi & Seufert, 2025). Regardless, it casts doubt on whether students prioritize AI for metacognitive tasks, that is, tasks involving the metacognitive skills of planning, monitoring, and evaluating one’s own learning.
(to be continued)


