Knowing when to use AI and when to leave it the hell alone.

Everyone is teaching you how to use AI faster. Nobody is teaching you how to think about what you lose when you do.

Slow AI is a newsletter for people who refuse to outsource their judgement to a machine.

Three posts per week. Monday: Slow Takes (live AI news). Wednesday: a post with a paid section. Friday: free.

The free posts give you the full argument. Posts with a paid section give you the argument for free, with exercises and deeper discussion behind the paywall.

Over 13,000 readers. 250+ paid members. Substack Bestseller.

Subscribe free to start reading, or upgrade to paid for the full experience.


What readers say

  • “You’re one of the few creators on here that actually makes me think. Not skim. Not nod along. Think.” — Nick Quick

  • “Your work consistently slows me down in the best way. You write about AI with care, curiosity, and humanity, creating room for reflection instead of urgency.” — Jessica Drapluk

  • “Sam is possibly the best thing about Substack. He’s created the best community, incredibly valuable content and is an expert at what he does.” — The Strategic Linguist


The Slow AI Curriculum for Critical Literacy

This is not a prompt engineering course. It is a 12-month programme for learning when AI helps, when it harms, and how to tell the difference. All paid subscribers have full access.

What you get for £100/year:

  • 12 monthly live seminars (45 min each, last Thursday of the month, 20:00 UK time via Zoom) with full recordings

  • A structured curriculum covering one critical theme per month, grounded in peer-reviewed research

  • CPD-accredited Certificate of Critical AI Literacy on completion (25 credits, The CPD Group, Course #1019972). Tax-deductible as professional development. Recognised internationally.

  • Moderated dialogue with current members across education, policy, and leadership

  • Full access to every paid section, past and future, including session recordings, research syntheses, and critical takeaways

Students and lower-income subscribers: email sam.illingworth@gmail.com for a reduced rate. No questions asked.

For teams and institutions: visit samillingworth.com/institutional.

The 12 monthly themes:

  1. The Myth of Neutrality (Data Origins and Bias)

  2. Synthetic Empathy (Affective Computing and Care)

  3. Security and Surveillance (Privacy and Institutional Control)

  4. The Labour of AI (Ghost Work and Extraction)

  5. Creative Agency (Mimicry versus Imagination)

  6. Ecological Costs (Environmental Impacts)

  7. Memory and Forgetting (Data Retention and Identity)

  8. Epistemic Crisis (Deepfakes and the Collapse of Truth)

  9. AI and Student Belonging (Pedagogy versus Policing)

  10. Digital Colonialism (Algorithmic Hegemony and Cultural Erasure)

  11. Algorithmic Governance (Automated Law and Ethics)

  12. The Slow AI Manifesto (Developing a Personal Ethical Framework)

Read the full 2026 Handbook for Critical AI Literacy for the complete syllabus.


Founding membership

Everything in the paid membership, plus your name listed as a Founding Member of Slow AI. 20 spots. When they are gone, they are not coming back.

£300/year.

Founding members:

Jenny Boavista, Ahad Amdani, Rob|GroktheWorld/OODA/Layer0


Slow Takes

Every Monday at 12:45 pm BST I co-host Slow Takes, a live conversation covering the week’s AI news without the hype. The recording is posted the same day. All episodes are available on the Slow AI Live page.


Free tools

These are free, no sign-up required, and open to everyone.

Bot or Not. 10 quotes. Half written by humans, half by AI. Can you tell the difference? Most people score no better than a coin flip.

Dead Reference. 10 academic citations. Half real, half fabricated by AI. Each reveal teaches a specific verification method. Built for researchers, students, and librarians.

Integrity Debt Audit. Paste an assessment brief and get a free report scoring it against 10 categories of AI vulnerability. Built for educators and programme leaders who want to fix their assessment design rather than blame students.


Who am I?

I’m Dr Sam Illingworth, a Full Professor of Creative Pedagogies based in Edinburgh, Scotland. I have a PhD in atmospheric physics, over 125 peer-reviewed publications, 10 books, and a career spent at the intersection of science, communication, and education.

My open-access book GenAI in Higher Education (Bloomsbury Academic, 2026) was downloaded 10,000 times in its first two weeks. I am a member of REF 2029 Sub-panel 23 (Education), a Principal Fellow of the Higher Education Academy, and a Fellow of the Young Academy of Scotland.

My work has been featured by the BBC, Nature, Scientific American, and The Conversation. I also lead the largest UK study into how students actually use AI tools.

I started Slow AI because most of the advice about AI is wrong. Not because the tools are bad, but because nobody is asking what we give up when we use them.


Not sure where to start?

New here? Start with these posts:

What is Critical AI Literacy - An exploration of how to critically engage with AI.

AI cannot be your friend - An inquiry into how AI is a tool masquerading as a companion.

Your AI is shaping your voice more than you think - How to observe the feedback loop between your language and the tool.

Ready to join? Subscribe for free to receive regular posts on critical AI literacy, or upgrade for the full curriculum and certificate.


You are very welcome here.

Go Slow,

Sam


All views expressed here are my own and do not represent those of my employer.
