AI in education: a chance to transform how we learn and think
AI isn’t hype or a passing trend. It’s a technology reshaping the very fabric of education. And yet, despite the rapid rise of ChatGPT, Suno, Midjourney and countless other tools finding their way into our lives and classrooms, many educators are grappling with questions like:
“What if a student lets AI write their entire assignment?” or “How will students learn to think if AI already gives all the answers?”
This blog isn’t another tools list or prompt cheat sheet. It’s a call for awareness. Because if we use AI wisely, it can become the most powerful learning partner education has ever known. But that requires critical choices, ethical reflection, and – above all – the right questions.
Why AI is more than just the next tech trend
Education has seen its fair share of digital revolutions: VHS tapes, DVDs, the internet, smartboards, YouTube… All of them made knowledge more accessible, learning more visual or interactive. But AI? AI is something else entirely.
AI doesn’t just offer information, it mimics thinking. It can summarise, reason, create, persuade, even debate. It doesn’t just give answers; it pretends to ask questions, simulates empathy, and imitates emotional intelligence. AI isn’t just a new kind of encyclopedia or fancy presentation tool. It’s a technology that, if used with intent, can act as a thinking partner, something (or someone?) that challenges you to think deeper, look further, doubt more. But only if you stay in control and keep thinking for yourself.
AI isn’t just changing what we learn, it’s transforming how we think, how we learn to question, and how we engage with knowledge. That’s why educators must approach AI not only technically, but also philosophically and ethically.
Critical thinking with AI: a non-negotiable skill
Using AI consciously means more than crafting clever prompts. It’s about understanding what technology does to us, mentally, socially, and structurally. It means asking the hard questions: Where is this data from? Who decides what’s true? How does the model even work?
We’re seeing growing reports of AI models hallucinating sources and confidently presenting falsehoods as facts (see: Tweakers, Royal Society). In some debates, AI now beats humans on persuasive power, even when it’s dead wrong (Washington Post). Bias and hallucinations are just the beginning. There’s also the black-box nature of AI (how does it actually think?), the risk of data misuse, environmental impact, confirmation bias, the loss of human judgment, and the immense power concentrated in the hands of just a few tech giants.
And then there’s how easily tech can be abused: one click is all it takes to create deepnudes from a single profile photo. Just because something is easy to do doesn’t mean it’s ethically okay. That awareness? Non-negotiable.
That’s why critical and ethical thinking around AI isn’t optional. Because if we only teach students how to prompt, and not how to question and reflect, we risk repeating the same mistakes we made with social media. And we’ve seen how that turned out:
- Invisible influence and polarisation
- Lack of transparency and viral misinformation
- Loss of autonomy through addictive algorithms
AI in the classroom: students need more than skills
Using AI is one thing. But teaching students to question what goes in – and what comes out – is where the real impact happens. That’s how we help them develop tech literacy, ethical awareness, and the kind of brains that aren’t afraid to doubt.
That’s when AI shifts from answer machine to thinking partner. And that’s when we empower learners to protect and preserve their autonomy. Outsourcing your thinking to AI? Not a good idea. Learning faster, thinking bigger, creating smarter with AI? Absolutely, as long as you stay in the driver’s seat.
Education is the place to build AI literacy that matters
If we want the next generation to use AI responsibly, we have to start now: not just teaching what tools can do, but helping students explore what they want to do with them, and what it costs. If we start developing digital thinking skills today, we’ll carry that mindset into every tech wave that’s still to come.
Where to Start: Ethical & Critical Thinking with AI
Below are five essential focus areas to help educators start thinking more deeply about how AI is used in education.
I’ve added a sample question to each domain, and if you’re ready to dive in, you can download our free AI Question Pack at AI-Brainlab.nl to get started right away. It’s in Dutch, but with a little help from a translation tool you’ll be fine.
- Educational Purpose First – Does this AI application lead to deeper understanding, or just shinier results?
- Privacy & Data Ethics – Do you know where your data goes, why, and what happens to it?
- Fairness & Accessibility – Who’s excluded or misrepresented in your AI tool’s output?
- Transparency & Responsibility – Where and how did you use AI in your assignment, and for what reason?
- Human Agency & Autonomy – Is this helping you learn, or just helping you make the deadline faster?
Raising Tech Awareness Among Young People
Tech awareness isn’t just about knowing how AI works. It’s about thinking through what technology does to people, society, and the planet. Who benefits? Who pays the price?
By teaching students to use AI critically, ethically, and creatively, we’re not just training users – we’re raising shapers of technology. That doesn’t require a coding background. It requires diverse voices, good questions, and space for dialogue.
AI in the classroom shouldn’t be an endpoint. It should be a starting point for deeper thinking. Because the real power of AI isn’t in the answers it gives, but in the questions we dare to ask.
AI Transparency Note:
This blog was created in collaboration with AI, including text and images. At AI-Brainlab, we believe tech should enhance thinking, not replace it. That’s why we use AI tools consciously, as sources of inspiration, clarity, and reflection.
All final edits, vision, and content choices were made by humans, as is our responsibility.
Tools used: ChatGPT, Perplexity, and Midjourney.