ZDNET’s key takeaways
- There is growing concern about student dependence on AI.
- Today’s computer science grads might understand less about IT systems.
- Some technology professors are pushing back against AI in classrooms.
Whether you are studying information technology, teaching it, or creating the software that powers learning, it’s clear that artificial intelligence is challenging and changing education. Now, questions are being asked about using AI to boost learning, an approach that has implications for long-term career skills and privacy.
Also: Got AI skills? You can earn 43% more in your next job – and not just for tech work
Stephen Klein, an instructor at the University of California, Berkeley, recently described an eye-opening assignment he gave his students on day one of class. He posed a question for them to address in a short essay: "How did the book Autobiography of a Yogi shape Apple's design and operations philosophy?" As Klein explained:
“A week later, 50 papers come in. And, like clockwork, about 10 look similar. (It is usually about 10% of them). I put them up on the big screen. I don’t say a word. I just let the silence work. (I watch the blood drain out of some of their faces.)”
"The class sees it instantly. Same structure. Same voice. Same hollow depth. That's when I explain: this is what it looks like when a machine that is essentially a stochastic, probabilistic autocomplete engine, sitting atop the same technology and sucking up the identical database, eats itself and gets fed the same prompt."
“This is what happens when you outsource your ability to think and let a machine do your thinking for you.”
This is the kind of angst plaguing the educational world, from grade school to universities: how to balance the tech literacy and skills needed for future job roles against the original, critical thinking required to build long-term careers and businesses.
Part of striking that balance is recognizing the benefits emerging technology offers educators. For example, AI agents in classrooms promise to help teachers with key areas, such as lesson planning and instruction, grading, intervention, and reporting. A majority of parents believe AI adoption in classrooms is critical to their children's education.
Also: 5 ways to fill the AI skills gap in your business
But there is growing pushback against over-reliance on AI. In June 2025, a group of 14 technology professors co-authored an open letter calling on educational institutions “to reverse and rethink their stance on uncritically adopting AI technologies. Universities must take their role seriously to a) counter the technology industry’s marketing, hype, and harm; and to b) safeguard higher education, critical thinking, expertise, academic freedom, and scientific integrity.”
The paper’s authors urge educational leaders to act “to help us collectively turn back the tide of garbage software, which fuels harmful tropes (e.g. so-called lazy students) and false frames (e.g. so-called efficiency or inevitability) to obtain market penetration and increase technological dependency. When it comes to the AI technology industry, we refuse their frames, reject their addictive and brittle technology, and demand that the sanctity of the university, both as an institution and a set of values be restored.”
Is there too much AI being pushed into curricula? And should STEM students be encouraged to understand the logic behind the solutions that technology delivers?
Also: Why AI chatbots make bad teachers – and how teachers can exploit that weakness
There may even be a visible decline in the quality of computer science learning now taking place, said Ishe Hove, associate researcher with Responsible AI Trust and a computer science instructor. "It's not the same quality as the computer scientists we graduated 10 years ago," she stated in a recent webcast hosted by Mia Shah-Dand, founder and president of Women in AI Ethics.
“What the graduates of 2023, 2024, and 2025 know now is how to prompt a code assistant technology, how to prompt ChatGPT, how to debug and use these assistive coding technologies,” Hove said.
“But they don’t have the technique of understanding the actual concepts, of understanding algorithms without using AI tools. Even the educators are kind of also falling short to that end, where the emphasis is on teaching tools instead of actually building the foundational skills and that mindset and competency that they will need for the long term.”
Also: Is AI a job killer or creator? There’s a third option: Startup rocket fuel
Hove recounted how, as a data science and AI instructor herself, “there was a temptation to do less teaching and to just teach them how to prompt ChatGPT and Gemini to get solutions or how to use particular software to debug their solution.”
However, she recognized that when she asked students to “walk me through their code and validate what they did, they had no idea what was happening. So I ended up enforcing a rule in class where I give them exercises and assessments while watching them and observing. And I insist that they actually learn how to code manually, inputting functions and stuff like that with their own efforts.”
Leaving too much education to AI results in gaps in the long-term skills needed to succeed in a future economy. “If educators teach a particular software at the expense of foundational skills, by the time our students leave either university or high schools, they’re ill-prepared in terms of doing life,” said Hove, “because their critical thinking and creativity and other soft skills, like working in a team, were undermined. By the time they get an employment opportunity or internship, they have no idea how to apply themselves in that context.”
The key to success is not to use AI for AI's sake — there has to be tangible value, said Amelia Vance, president of the Public Interest Privacy Center and professor at William and Mary Law School, who also participated in the webcast: "The best and truly most innovative AI tools are not those that are blowing our minds. It's making sure that the technology is serving a purpose where we haven't had sufficient tools before this."
Also: Jobs for young developers are dwindling, thanks to AI
For example, she said, “the best use case in education is a tool that is connecting underlying curricular standards to education technology products that schools have already signed up for. And helping teachers brainstorm about their lessons and saying, ‘Okay, this is a video not directly on this topic that will help students grasp a particular concept.’ So, it’s very practical.”
The best approach to AI is to “make sure that products are vetted properly,” Vance added. “And make sure that we are being careful and deliberate about adoption using AI as a tool and not as the goal in and of itself.”