What if I told you AI’s greatest gift isn’t speed and ease, but space and reflection? Two weeks ago at Gen Jam, we created short films using AI in a frantic three-hour sprint. The process was thrilling, weird, confusing, and incredibly fun. But the biggest revelation for me wasn’t the AI output… it was what happened in the time we waited.

Prompting ≠ Searching: It’s More Like First Contact

Good prompting isn’t about typing keywords into a search bar or memorizing fancy commands from a cheat sheet. It’s not even just about getting answers; it’s about building a kind of intentional dialogue.

Think of it this way:

  • Search engines are librarians fetching known books.
  • AI acts more like an alien collaborator, one trained on vast amounts of human knowledge, filled with facts and brilliance, but also bias, gaps in understanding, and blind spots. Every response it gives is a reflection of all that data.

At Gen Jam, I saw this firsthand. Some prompts failed outright. Others generated unsettling stereotypes or flat, uninspired results. But instead of treating the AI like a broken tool, we treated it like a strange new collaborator. We rephrased, debated, and experimented, not just to get better outputs, but to understand why the AI responded the way it did.

That’s the real shift: AI isn’t a shortcut to creativity; it’s a mirror, an amplifier, and sometimes a sparring partner. The magic isn’t in the first answer it gives you. It’s in the loop: Prompt. Pause. Reframe. Repeat.
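
For the tinkerers: here’s a minimal sketch of that loop in Python, assuming the openai client library, an API key in your environment, and a placeholder model name. It isn’t how we ran Gen Jam; it just shows the shape of the loop, with the human pausing and reframing between every response.

```python
# A rough sketch of the prompt -> pause -> reframe -> repeat loop.
# Assumes the `openai` package is installed and OPENAI_API_KEY is set;
# the model name below is a placeholder, not a recommendation.
from openai import OpenAI

client = OpenAI()
history = []  # running conversation, so each reframe builds on the last

while True:
    prompt = input("Prompt (blank to stop): ").strip()
    if not prompt:
        break
    history.append({"role": "user", "content": prompt})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder
        messages=history,
    )
    answer = response.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    print(answer)
    # The "pause": read, question, and decide how to reframe before the next prompt.
```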

By the end, we weren’t just using AI; we were jamming with it, co-creating. And that’s the skill we all need to learn. Not cheat codes and prompt cheat sheets. Not rigid formulas. But the ability to ask good questions, listen actively, and adapt responsibly.

Because this isn’t just another piece of tech. It’s the first member of a new species, and we’re all just starting to speak its language. Or maybe think of it as a new ecosystem, one we’re all learning to navigate ethically.

This isn’t about slowing down; it’s about learning to drive better, faster, and with more control.

AI’s Ironic Gift: Time to Think

Here’s what surprised me most: the most human moments happened because of AI, not in spite of it. We’d send off a prompt—then wait. For images to render, scripts to generate, or dialogue to rewrite.

Tech promised to eliminate downtime, but here’s what it gave us instead:

  • Jokes that turned into inside stories.
  • Debates about why the AI hallucinated a polka-dotted sunset.
  • Silences where someone would suddenly say, “Wait, what if we tried…?”

We’re conditioned to think faster tech means faster thinking. But in this case, the delay gave us something rare: space to slow down. To reflect. To connect.

Turns out, the most valuable tool wasn’t the one that saved us the most time. It was the realization that AI gave us time, and that it was ours to use wisely.

We’ve Been Here Before—Let’s Do Better This Time

We’ve seen powerful tech reshape human behavior before.

  • We didn’t teach people how algorithms curated and filtered their view of the world.
  • We underestimated how the attention economy rewired our habits.
  • We called social media a “tool for connection” while it created echo chambers and amplified polarization.

Today, we’re playing catch-up. Let’s not make the same mistake with AI.

AI development won’t wait for us to catch up.

This isn’t just another app. It’s a system that responds to how we use it, and to what we fail to question. That’s why this time, we need to engage differently. Not with blind trust. Not with fear. But with curiosity, intention, and some uncomfortable questions.

A License to Drive, Not a Repair Manual

You don’t need to engineer an AI to work with one, just like you don’t need to build a car to drive it. But you do need to:

  • Learn the rules of the road (How does AI “think”? Where does it fail?),
  • Spot the warning lights (When is it inventing things? Reinforcing bias? Putting privacy at risk?),
  • Keep your hands on the wheel (The AI suggests; you steer, and you decide what to use from its output and how).

Passive use isn’t harmless; it just means someone else is driving. And that “someone” might be a for-profit big tech algorithm with no stake in your classroom, campaign, or community.

The Goal Isn’t Expertise—It’s Fluency

We can’t all be AI developers and experts. But we do need a baseline of AI wisdom for everyone, so people learn how to:

  • Prompt like a conversation, not a demand, asking good questions along the way,
  • Question outputs instead of simply accepting them, staying skeptical rather than seduced by their ease,
  • Create with AI, not just through it,
  • Co-create with awareness, crediting sources, checking bias, and asking who is missing from the data.

The best parts of Gen Jam weren’t the outputs; they were the debates, the surprises, the human moments between the prompts. That’s the future I’d love to build: one where AI deepens thinking but never replaces judgment, ethics, or critical reflection.

Let’s hand the next generation a map and a compass, not a user agreement.
Let’s teach them to think with and about AI and get better at it together.

AI Transparency Note

Blog written with help from ChatGPT and DeepSeek to improve the text and review arguments.
Image created with ChatGPT.
