The Growing Dependency on AI in Academia

By: Raaid Bakridi, CIPP/US

I. Introduction

In the 21st century, Artificial Intelligence (“AI”) has become an integral part of daily life. From virtual assistants like Siri and Alexa to machine learning algorithms powering recommendation systems,[1] AI is undeniably everywhere[2] and increasingly normalized. As U.S. Vice President JD Vance puts it, AI presents an “extraordinary prospect of a new industrial revolution, one on par with the invention of the steam engine.”[3]

AI has also made significant strides in education and academia, offering tools that assist students with research, outlining, essay writing, and even solving complex mathematical and technical problems.[4] However, this convenience comes at a cost. An analysis of AI tutors highlights their potential to enhance education while also raising concerns about overreliance on technology.[5] Rather than using AI as a supplement, many students rely on it to complete their work for them while still receiving credit, which poses challenges to academic integrity and the role of AI in learning.[6] This growing dependence raises concerns about its impact on creativity, critical thinking, overall academic performance, and long-term career prospects. Students are becoming more dependent on AI for their schoolwork, and the dangers of this dependency carry significant implications for their futures.[7] If students continue to let AI think for them, the future of our nation will face extreme challenges.

II. The Rise of AI in Academia

AI tools have become increasingly popular among students for various academic tasks. These range from basic grammar checkers like Grammarly and plagiarism detectors like Turnitin to more advanced AI-powered research assistants and text generators such as Lexis+ AI, Westlaw CoCounsel, ChatGPT, Microsoft Copilot, and DeepSeek.[8] Tools like Grammarly and Turnitin have long been staples in academic settings, helping students refine their writing and ensure originality. More recently, AI-driven platforms such as OpenAI’s ChatGPT and Microsoft Copilot have gained traction for their ability to generate coherent, contextually relevant text, making them appealing to students looking to streamline their writing process.[9]

The appeal of these tools is understandable. They offer a quick and efficient way to complete assignments, often requiring minimal effort from the user.[10] For instance, a student struggling with a history essay or pressed for time on a research project can simply input a prompt into an AI text generator and receive a well-structured draft within seconds. AI-powered research tools can also scan vast amounts of data to provide relevant sources almost instantly, saving hours of manual research.[11] In academia, however, these shortcuts come at a cost. While these tools can serve as helpful assistants, students must engage with the learning process to develop critical thinking, analytical skills, and the ability to articulate ideas effectively.[12] Overreliance on AI risks bypassing essential learning experiences, ultimately hindering students’ ability to process information and express themselves independently.[13]

III. The Problem with AI Dependency

The growing reliance on AI in academia raises significant concerns about creativity, critical thinking, academic performance, and ethical integrity.[14] One major issue is the erosion of creativity and imagination.[15] Creativity is a cornerstone of learning, enabling students to think independently, solve problems innovatively, and express themselves uniquely.[16] Yet, when students use AI to generate essays or solve basic math problems, they bypass the creative and learning process entirely.[17] While the output may be well-structured, it lacks the personal touch, originality, and understanding that come from independent thought.[18] Over time, this dependency can stifle a student’s ability to develop their own voice, a skill that enhances self-expression and critical engagement.

However, while having one’s own voice is valuable, it is not as fundamental as literacy. The ability to read and write remains the foundation of education, and an overreliance on AI could undermine even these essential skills.[19] If students become dependent on AI-generated content, they risk weakening their ability to engage with complex texts, construct coherent arguments, and communicate effectively—abilities far more critical to academic and professional success than simply having a unique voice.[20] As one study found,

[t]his over-reliance could lead to a range of issues, including diminished critical thinking, analytical thinking, and decision-making abilities susceptibil[e] to AI-generated errors or AI hallucinations, increased instances of plagiarism, and challenges related to lack of transparency and algorithmic biases. Moreover, habitual dependence on AI for decision-making may reduce individuals’ motivation to engage in independent thinking and analysis, potentially leading to a weakening of essential cognitive abilities and automation bias.[21]

Critical thinking is another casualty of AI over-reliance. Analyzing information, evaluating arguments, and making well-reasoned judgments are essential for academic and career success.[22] However, when students depend on AI for answers, they fail to engage deeply with the material.[23] For instance, using an AI-powered math solver to complete homework may yield correct answers, but it does not teach the problem-solving process or underlying concepts. This superficial engagement can hinder students’ ability to apply knowledge in real-world scenarios.[24] Academic performance may also suffer. While AI tools help students complete assignments quickly, they do not necessarily improve learning outcomes.

Research shows that students who rely heavily on AI often perform worse on assessments requiring independent thinking and problem-solving.[25] For instance, a study by the University of Pennsylvania revealed that:

Turkish high school students who had access to ChatGPT while doing practice math problems did worse on a math test compared with students who didn’t have access to ChatGPT. Those with ChatGPT solved 48 percent more of the practice problems correctly, but they ultimately scored 17 percent worse on a test of the topic that the students were learning.[26]

Similarly, students utilizing AI-powered tutoring systems have demonstrated lower performance on standardized tests than those receiving traditional instruction.[27] Ethical concerns further complicate the issue. AI tools can be misused to generate essays or plagiarize content, undermining academic integrity.[28] This creates an uneven playing field, where students who use AI gain an unfair advantage over those who do not, fostering a culture of dishonesty that threatens the credibility of academic institutions.[29]

IV. The Dangers of AI in Academia

The risks of AI in academia extend beyond individual performance, impacting the broader educational landscape and society.[30] One pressing concern is the loss of human interaction in learning.[31] Education is not just about acquiring knowledge; it is also about building relationships, character, and social skills.[32] As the U.S. Department of Education’s mission statement puts it, the goal of education is “to promote student achievement and preparation for global competitiveness by fostering educational excellence and ensuring equal access.”[33] Teachers provide mentorship, emotional support, and adaptability that AI cannot replicate. As Sir Ken Robinson noted, “[t]eaching is a creative profession, not a delivery system.”[34] Over-reliance on AI risks depriving students of these meaningful human interactions, which are essential for personal and social development.[35]

AI also exacerbates educational inequality. Access to advanced AI tools is uneven, with students from disadvantaged backgrounds often lacking the resources to compete with their more privileged peers.[36] This digital divide could widen achievement gaps and reinforce existing inequities in the education system.[37] Moreover, an over-reliance on AI in education mirrors a broader societal dependence on technology.[38] Students who rely on AI for academic tasks may struggle to perform basic writing, problem-solving, or analytical tasks without technological assistance.[39] This dependency could hinder the development of critical life skills, making individuals less adaptable and resilient in real-world situations.[40]

V. Conclusion and Practical Considerations

The rise of AI in academia presents both opportunities and challenges. While AI tools offer undeniable benefits, their overuse threatens creativity, critical thinking, academic integrity, and equitable access to education.[41]

However, AI is not inherently harmful. When used responsibly, it can be a powerful tool for innovation and growth.[42] The key to this responsible use lies in striking a balance between leveraging AI’s capabilities and fostering essential human skills like creativity, critical thinking, and problem-solving.[43] Promoting responsible AI use is a critical first step. Educators can integrate digital literacy and ethical AI practices into their curriculum, teaching students to use AI as a tool for brainstorming or idea generation rather than a replacement for independent work.[44] For example, students might use AI to generate initial ideas for an essay but must craft the final product in their own words. This approach ensures that AI complements, rather than replaces, independent thought.

Equally important is fostering creativity and critical thinking. Active learning strategies, such as project-based learning, encourage students to engage deeply with material, conduct research, and present original ideas. Creative writing and artistic expression can further help students develop their unique voices, reinforcing the value of independent thought.

Collaboration and peer learning also play a vital role in counteracting AI dependency. Group projects and peer review exercises challenge students to refine their arguments, solve problems, and engage dynamically with the subject matter, fostering a sense of academic community and reducing over-reliance on AI.

Finally, ensuring equitable access to AI tools is crucial.[45] Government funding, school initiatives, and partnerships with technology companies can help bridge the digital divide, providing all students with access to AI-powered resources regardless of socioeconomic background.[46] By addressing these challenges—promoting responsible AI use, fostering creativity and critical thinking, encouraging collaboration, and ensuring equitable access—we can prepare students to thrive in the 21st century.


References

[1] Glossary, Recommendation Systems, Nvidia, https://www.nvidia.com/en-us/glossary/recommendation-system/ (last visited Mar. 5, 2025) (“[a] recommendation system is an artificial intelligence or AI algorithm, usually associated with machine learning, that uses Big Data….”).

[2] A.I. Here, There and Everywhere, The New York Times (Mar. 9, 2021), https://www.nytimes.com/2021/02/23/technology/ai-innovation-privacy-seniors-education.html.

[3] Aamer Madhani and Thomas Adamson, JD Vance Rails Against ‘Excessive’ AI Regulation in a Rebuke to Europe at the Paris AI Summit, Associated Press (Feb. 11, 2025), https://apnews.com/article/paris-ai-summit-vance-1d7826affdcdb76c580c0558af8d68d2.

[4] See generally John Bailey and John Warner, AI Tutors: Hype or Hope for Education?, Education Next, Vol. 25, No. 1, https://www.educationnext.org/ai-tutors-hype-or-hope-for-education-forum/ (analyzing the potential of AI tutors to enhance education while addressing concerns about effectiveness, equity, and reliance on technology).

[5] Id.

[6] Id.

[7] Id.

[8] See Jon Cavicchi, Exploring Generative and Extractive AI in Legal Research: A Comparison of Leading Models, University of New Hampshire Franklin Peirce School of Law, Law Library, https://law.unh.libguides.com/blog/Exploring-Generative-and-Extractive-AI-in-Legal-Research-A-Comparison-of-Leading-Models; see also Ema Lukan, The 50 Best AI Tools for 2025 (Tried and Tested), Synthesia, https://www.synthesia.io/post/ai-tools (last visited Mar. 3, 2025).

[9] See Using AI Ethically in Writing Assignments, The University of Kansas, Center for Teaching Excellence, https://cte.ku.edu/ethical-use-ai-writing-assignments (last visited Mar. 5, 2025) (discussing best practices for the ethical use of AI in academic writing, including transparency, proper attribution, and maintaining academic integrity).

[10] Bailey and Warner, supra note 4.

[11] Consensus AI, How to Write a Research Paper with AI Search Engines, Consensus, https://consensus.app/home/blog/how-to-write-a-research-paper-with-ai-search-engines/ (last visited Mar. 5, 2025).

[12] See generally Yunyi Zhu, The Impact of AI-Assisted Teaching on Students’ Learning and Psychology, 38 J. Educ. Human. Soc. Sci. 111 (2024) (analyzing how AI-assisted teaching tools affect students’ learning outcomes and psychological well-being, highlighting both positive and negative impacts).

[13] Id.

[14] Id.

[15] Id.

[16] See generally Elizabetta Oris, AI’s Impact on Student Creativity, The Plaid Press (Jan. 17, 2025), https://theplaidpress.com/2025/01/17/ais-impact-on-student-creativity/ (exploring how artificial intelligence tools influence student creativity, highlighting both the potential benefits and challenges in fostering original thought and innovation in educational settings).

[17] Id.

[18] Id.

[19] See generally Chunpeng Zhai et al., The Effects of Over-Reliance on AI Dialogue Systems on Students’ Cognitive Abilities: A Systematic Review, 11 Smart Learning Env. 28 (2024), https://slejournal.springeropen.com/articles/10.1186/s40561-024-00316-7 (systematically reviewing studies on how excessive dependence on AI dialogue systems can negatively impact students’ cognitive development and learning outcomes).

[20] Id.

[21] Id. at 2.

[22] See generally Imad Khan, New Research Suggests Overreliance on AI Could Hinder Critical Thinking, CNET (Feb. 12, 2025), https://www.cnet.com/tech/services-and-software/new-research-suggests-overreliance-on-ai-could-hinder-critical-thinking/ (discussing research that highlights the potential negative impact of excessive reliance on AI tools, including reduced critical thinking and problem-solving skills among users).

[23] Id.

[24] Id.

[25] See generally Sarah Thomson et al., The Impact of AI on Education and Careers: What Do Students Think?, National Library of Medicine (Nov. 14, 2024), https://pmc.ncbi.nlm.nih.gov/articles/PMC11602497/ (examining student perspectives on the role of artificial intelligence in shaping educational experiences and future career opportunities).

[26] Jill Barshay, Kids Who Use ChatGPT as a Study Assistant Do Worse on Tests, The Hechinger Report (Sept. 2, 2024), https://hechingerreport.org/kids-chatgpt-worse-on-tests/.

[27] See generally id.

[28] Using AI Ethically in Writing Assignments, supra note 9.

[29] Id.; but see Barshay, supra note 26.

[30] Thomson et al., supra note 25.

[31] Id.

[32] See generally What is the Purpose of Education, Allison Academy, https://www.allisonacademy.com/parents/child-development/what-is-the-purpose-of-education/ (last visited Mar. 5, 2025) (explaining that the purpose of education is to develop critical thinking, foster personal growth, and prepare individuals for meaningful participation in society and the workforce).

[33] Federal Role in Education, U.S. Dept. of Ed., https://www.ed.gov/about/ed-overview/federal-role-in-education (last visited Mar. 5, 2025) (“[d]espite the growth of the Federal role in education, the Department never strayed far from what would become its official mission: to promote student achievement and preparation for global competitiveness by fostering educational excellence and ensuring equal access.”).

[34] Umes Shrestha, Why Educators Don’t Think Critically, Teaching Learning Defined (Jan. 8, 2025), https://umesko.substack.com/p/why-educators-dont-think-critically.

[35] See generally Oris, supra note 16; see also Khan, supra note 22; Thomson et al., supra note 25.

[36] See generally Paul McKean, Without Intervention, AI Could Widen the Digital Divide for Students, Digital Leaders (Sept. 25, 2023), https://digileaders.com/without-intervention-ai-could-widen-the-digital-divide-for-students/ (arguing that without targeted policies, the rapid advancement of AI technology may exacerbate educational inequalities, particularly among underserved student populations).

[37] Id.

[38] Zhai et al., supra note 19, at 31.

[39] Id.

[40] Id.

[41] See generally Hamsa Bastani et al., Generative AI Can Harm Learning, The Wharton Sch. Research Paper (2024), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4895486 (arguing that generative AI can negatively impact learning by promoting surface-level understanding, reducing critical thinking, and fostering dependency on AI-generated content).

[42] See generally Andres Fortino, Thinking with AI—Pros and Cons—Language, Logic, and Loops, NYU Sch. of Prof. Stud. (Sept. 5, 2023), https://www.sps.nyu.edu/homepage/metaverse/metaverse-blog/Thinking-with-AI-Pros-and-Cons-Language-Logic-and-Loops.html (exploring the benefits and drawbacks of artificial intelligence, focusing on its impact on language processing, logical reasoning, and the potential for feedback loops in AI systems).

[43] Id.

[44] Kate Grovergrys and Tina Rettler-Pagel, Leveraging AI for Equity in Teaching and Learning, Comm. College Daily (Oct. 23, 2024), https://www.ccdaily.com/2024/10/leveraging-ai-for-equity-in-teaching-and-learning/.

[45] See McKean, supra note 36.

[46] Id.