The Days of ‘Just Google It’ Are Over: Why We Must Teach Students to Dig for Truth

I’ve lost count of how many times I’ve said it over the years. A student asks me something towards the end of class, I don’t know the answer off the top of my head, and I respond: “Google it and find out the answer for me.” It’s been the default response for years. A simple gesture that says we’re all learners together.

Those days are over.

Empty Guidance, Real Problems

When the Department of Education’s “Guidance on Artificial Intelligence for Irish Schools” finally arrived in October 2025, after an 18-month delay, my first reaction was disappointment. The document felt anticlimactic, even empty. Whilst it provided permission for teachers to use AI for lesson planning and resource creation, it offered virtually nothing to help us address the most pressing challenge: our students’ use of AI. We were left, once again, to figure it out ourselves.

And figure it out we must, because the problem is already here. A case in point arose in a class I was teaching recently.

A History Lesson in AI Confusion

So last week I’m teaching Junior Cycle History. We’re watching a YouTube video about Michelangelo. My students already know about his genius as a sculptor, painter and architect. Then the video mentions he designed the Swiss Guard uniforms, complete with all the bows, flourishes and amazing colours one might expect from a Renaissance artist with boundless flair. My first reaction? Brilliant. Another string to his bow: Michelangelo was a fashion designer too. How I wished I’d known this sooner and shared it with the hundreds of history students I’ve taught over the past 30 years.

But then my instincts as a historian kicked in.

So naturally, I searched Google to verify this claim. What appeared at the top? Google’s “AI Overview” stating confidently that yes, Michelangelo had indeed designed the Swiss Guard uniform.

But now another instinct kicked in: my scepticism towards anything generated by AI.

With Google’s AI Overviews, you have to make a deliberate effort, pressing the “show more” button, to discover the real truth: one of the cited sources, Britannica, stated that “contrary to legend, Michelangelo did not design them. In fact, the design is largely the work of Jules Repond, who served as commander of the guard between 1910-21.” This was also confirmed by the Swiss Guard’s official website: https://schweizergarde.ch/paepstliche-schweizergarde/en/about-us/uniforms/

However, Google still presented the false claim as fact at the very top of search results, where students instinctively look first. The correct information was hidden away, requiring extra clicks to find. Google’s disclaimer that “AI can make mistakes” is only visible at the very bottom of the output, and even then it requires the user to press the “show more” button.

This is the reality our students face every day. 

But they won’t always have a teacher like me pushing them to dig deeper for the truth buried beneath layers of algorithmic “certainty”.

The Reality of Student AI Literacy

This isn’t just about one incorrect answer. The DCU Anti-Bullying Centre’s March 2025 report, “Exploring How Young People Navigate the Evolving Online World in the Era of Artificial Intelligence and Misinformation”, reveals the scale of the challenge we’re facing.

Surveying 109 Irish post-primary students, researchers found sobering results:

  • Only 22% felt confident recognising deepfake or AI-generated content
  • Over 10% admitted having no confidence whatsoever in identifying such material
  • Fewer than one in three correctly identified false political news
  • Only 27.5% recognised fabricated technology stories

Perhaps most concerning: whilst 61.5% regularly use AI-powered filters and 35.8% have encountered deepfakes, the study revealed “varying levels of understanding” about how these technologies actually work.

The researchers concluded that, among Irish teenagers, “gaps exist in advanced digital literacy, particularly in identifying advanced forms of misinformation.” They explicitly called for the integration of “AI literacy” into Irish school curricula to address these deficiencies.

This evidence directly supports the urgent need for educational intervention. Our students are navigating an AI-saturated information landscape without the critical literacy skills to distinguish fact from algorithmically-generated fiction.

We’re All Part of Google’s Experiment

Google began rolling out AI Overviews in May 2023, expanding them dramatically from March 2024. By autumn 2024, the feature had reached over 100 countries, including Ireland. By now you’ve probably noticed that AI Overviews trigger in the majority of Google searches for most users worldwide.

The initial phase was disastrous. The system produced bizarre or incorrect answers: recommending that people eat rocks, suggesting glue as a way to keep cheese on pizza. These errors came from the system sourcing unreliable or satirical internet content.

Google has made massive improvements since then. But the system still occasionally makes mistakes and can seem simplistic or overconfident. Here’s the uncomfortable truth: for many users, AI Overviews are now an unavoidable part of using Google Search.

Digital Natives or Naivety?

Teenagers are often called “digital natives.” It’s a reassuring phrase that suggests innate competence, as if being born into a world of smartphones automatically confers wisdom about how they work.

But anyone who has taught a digital literacy class will tell you something different. These students are certainly digital, yes. But what I’m witnessing in classrooms isn’t native fluency. All too often it’s digital naivety.

Growing up with technology doesn’t automatically mean students understand how to critically evaluate what it produces. They can swipe, scroll and post with impressive speed, yet struggle profoundly to judge whether what they’re reading is true. They’re fluent in the interface, but far too many are illiterate in the critical thinking required to navigate it safely. And they are far too trusting, as the DCU report suggests.

Time to Dig for Truth

As I said, I am a history teacher. During World War II, the British government issued propaganda posters declaring “Dig for Victory”, encouraging citizens to grow their own food as a means of self-sufficiency in wartime. Today, we need a similar campaign: “Dig for Truth”.

(By the way: my version of the poster was made with AI.)

Just as those wartime posters urged people not to rely passively on others for their food, we need to teach our students not to rely passively on AI for their information. Self-sufficiency in the digital age means digging beneath that first AI-generated answer, questioning what appears at the top of search results, and doing the work to verify claims for themselves.

We can no longer tell students to “just Google it” and assume they’ll find reliable information. The skills they need must be explicitly discussed, demonstrated and modelled in our classrooms.

What We Need to Teach

Students need to learn how to question AI-generated content that appears at the top of search results, regardless of how confident it sounds. They need to press “show more” and look beyond the AI summary to find actual sources and verification. They need to cross-reference information across multiple independent sources before accepting claims as fact. And they need to understand AI’s limitations, particularly its tendency to present fiction as fact with absolute confidence.

We cannot wait for perfect policies or comprehensive guidelines. The Department’s guidance acknowledged AI’s potential benefits but left a gaping hole where practical classroom strategies should be.

Students are using these tools right now, encountering AI-generated content in every search. We need to act.

  • Model critical thinking by openly questioning AI outputs in class and demonstrating verification processes in real time.
  • Share real examples of AI errors when we encounter them, turning mistakes into teaching moments.
  • Teach verification as routine practice, not as a special digital literacy lesson but as part of everyday subject teaching.
  • Acknowledge our learning journey together, showing students that navigating AI-generated information is a challenge for everyone, including teachers.

The New Reality

The burning question isn’t whether AI will transform information access. It already has, as Google Search makes plain.

The question is whether we’ll prepare our students to handle that transformation, teaching them to look beyond the confident AI summary at the top of their search results and dig for the truth underneath.

It’s very clear that tech giants like Google and Microsoft are determined to push AI integration, regardless of whether users want it or feel prepared for it. That makes our role as educators more vital than ever. Version 2 of the Department’s AI guidelines must help in this regard, and it must be accompanied by appropriate training frameworks that will help teachers take on this immense challenge with confidence.

The days of “just Google it” are over.

Now we must teach them to dig.


References:

Esfandiari, M., Aşçı, S., Feijóo, S., Reynolds, M., O’Toole, C., McGarrigle, J., Heaney, D., & O’Higgins Norman, J. (2025). “Exploring How Young People Navigate the Evolving Online World in the Era of Artificial Intelligence and Misinformation”, DCU Anti-Bullying Centre. ISBN: 978-1-911669-84-5.


Patrick Hickey (@aiteachingguru on all major social media) is an AI CPD provider, media contributor, and a current teacher and Assistant Principal at Boherbue Comprehensive School, Co. Cork.

If you have any queries, you can email aiteachingguru@gmail.com.
