🧱 Building Information Integrity, One Skill at a Time
Reality Check is now Getting Better News - a shift from identifying misinformation to understanding what is truly meaningful. We look forward to sharing this new chapter with you.
We are living in a time when false or distorted information spreads faster and feels more personal than ever, yet our tools for understanding it are quietly getting better too. We look at how institutions, educators and communities are shifting from quick fixes to long-term strategies that strengthen people’s capacity to navigate complex media environments.
Explore our curated media and information literacy resources to build calmer, evidence-based habits for navigating today’s information floods.
🧩 Media literacy as an ecosystem, not a quick fix
The World Economic Forum argues that media and information literacy must now span everything from AI and privacy literacy to news, advertising and human rights, and that fragmented “one-off” school projects are not enough in the face of AI‑accelerated disinformation. It supports this with a new “information resilience mapping model” that combines the life cycle of disinformation (from pre‑creation to post‑consumption) with a socio‑ecological view of individuals, communities, institutions and policy. The common perception that media literacy is just about teaching children to spot fake news misses this systemic view, in which platform design, workplace training and regulation all shape what people see and how they interpret it.
The long‑term trend is towards more actors recognizing media literacy as a shared responsibility rather than a classroom add‑on. What remains unresolved is how quickly practice and funding can catch up with this broader model, especially as generative AI keeps lowering the cost and increasing the persuasiveness of targeted false narratives.
The Better Take:
When stories spotlight a single media literacy workshop as the solution, it helps to ask where it fits in the wider system - who controls incentives, how platforms behave, and whether adults are learning too.
🏛️ Disinformation policy moving from panic to planning
The Council of Europe’s “Ten building blocks” report describes disinformation as a long‑running “wicked problem” that undermines public trust, but it also shows governments slowly moving from scattered reactions to more structured national strategies. The document calls for evidence‑based approaches that prioritize research, media and information literacy, and support for quality journalism and election integrity, while explicitly warning against over‑reliance on criminal law and against quick fixes that chill free expression.
Over the past decade, international standards and promising national strategies in countries like Ireland, Norway and Latvia suggest a slow trend toward more coherent, rights‑based frameworks, even if implementation is uneven and often under‑resourced.
The Better Take:
When you see claims that one law or one platform tweak will “solve” disinformation, it is worth looking for whether there is a broader, rights‑respecting strategy behind it, or just a short‑term reaction to the latest crisis.
🎓 Media literacy in classrooms as democracy infrastructure
NYSUT frames media literacy as essential for “democracy‑ready” students: young people spend around eight and a half hours a day on screens, yet many admit they lack the skills to evaluate what they see, and surveys show that large majorities of students struggle to judge credibility or to distinguish news from ads and dubious sources. Educators and librarians describe practical strategies such as teaching students to check for bias, question whether content may be AI‑generated, use “lateral reading” in extra browser tabs, and rely on curated search tools that surface vetted sources, while also noting the decline in full‑time school librarians and in time for civics and media literacy.
The perception that teenagers are “digital natives” who intuitively understand online information ecosystems does not match the evidence that many lack background knowledge and structured guidance to spot manipulation or agenda‑driven content.
The Better Take:
Whenever you hear that young people “just need to be more careful online,” it helps to ask whether they actually have structured time, skilled mentors and stable institutions backing those skills, or mostly good intentions and guesswork.
