Sam Illingworth from Slow AI: Building a Portfolio Career Around Critical AI Literacy
✨The Career Pivot Playbooks Series
Sam Illingworth from Slow AI on translating academic expertise into public impact through Substack. Part of Career Pivot Playbooks: a public archive of modern career blueprints.
This Career Pivot Playbook features Sam Illingworth, a Full Professor of Creative Pedagogies, poet, and founder of Slow AI, a Substack publication helping professionals develop critical AI literacy.
Sam’s career didn’t pivot overnight. It evolved through academia, science communication, poetry, and pedagogy… consistently bridging expertise and public understanding.
That emphasis on public understanding would later become central to how Sam approached AI.
Rather than abandoning academia, Dr Sam Illingworth expanded beyond it. Substack became a way to extend research into public discourse, build community around learning, and reach those most affected by rapid AI adoption.
This Playbook explores how expertise, when paired with public writing and consistent engagement, can become the foundation of a modern portfolio career.
✨About Career Pivot Playbooks
A public archive of modern career blueprints…
Most careers no longer follow a straight line.
People pivot gradually.
They extend their work beyond institutions.
They combine roles, platforms, and income streams.
Career Pivot Playbooks is a weekly series curated by Katharine Gallagher documenting how professionals build resilient, future-ready careers by turning existing skills into income, often combining Substack with consulting, teaching, research, creative work, or business ownership.
The focus isn’t on outcomes.
It’s on how careers are shaped in practice.
About Sam
Sam Illingworth is a Full Professor of Creative Pedagogies, poet, and founder of Slow AI, a Substack publication helping professionals develop critical AI literacy. His work focuses on when to use AI and when to leave it the hell alone.
On Substack, Sam writes thoughtful essays on AI, education, and judgement. You can follow his work on Substack.
Sam reflects:
“What inspired you to make a career pivot, and how did you know it was the right time?”
The realisation wasn’t dramatic. It was gradual, like watching a tide come in. When ChatGPT arrived, I saw colleagues panic, students scramble, and institutions flail around trying to write policies. But I also saw something else: an opportunity to explore what this meant for diverse communities, particularly in higher education where I’d been working for years.
I pivoted a significant portion of my research toward AI and its implications for universities. I was fortunate to secure research grants, including Leverhulme funding, to explore these intersections properly. That research led to my promotion to Full Professor of Creative Pedagogies and eventually to co-authoring GenAI in Higher Education with Rachel Forsyth, published by Bloomsbury Academic as an open-access book in January 2026.
But here’s what surprised me: the more I researched AI in academic contexts, the more I realised the conversation couldn’t stay locked in peer-reviewed journals. Academia operates in a publish-or-perish paradigm. I do that too. I’ve published over 100 peer-reviewed articles and written multiple books. But I started asking myself: what’s the actual impact? My research might appear in good journals, but who reads it? A few hundred specialists if I’m lucky.
Meanwhile, the people who genuinely needed critical AI literacy (teachers, professionals, parents, anyone feeling overwhelmed by adoption pressures) weren’t reading academic journals. They were online, searching for guidance that went beyond hype or fear-mongering. That’s when I started Slow AI on Substack in July 2024. Not as a side project, but as a deliberate attempt to break down the barriers between academia and the rest of society. This is what the purpose of education should be.
“Which skills from your previous roles have been most valuable in your new career or business?”
Everything I’d learned about science communication became unexpectedly relevant. My years developing poetry workshops taught me how to make complex ideas accessible without dumbing them down. The theatrical training from Japan gave me an understanding of audience, narrative, and how to hold people’s attention. My work on student engagement showed me how to build community around learning, not just broadcast information.
But the most unexpected advantage? My physics background. It gave me a certain comfort with uncertainty and probabilistic thinking that translates directly to understanding how large language models actually work. I’m not a developer or an engineer, but I can read the research, understand (some of) the mechanisms, and translate that into frameworks people can use. That scientific literacy combined with creative communication skills turned out to be helpful.
“How did you discover Substack and why did you choose it as your primary engine for growth?”
I chose Substack deliberately because I see it as the university for AI. It’s where people go to learn, to think critically, to engage with ideas beyond the soundbites and the hype cycles. Traditional universities have a crucial role, but they’re not always accessible to everyone who needs this knowledge right now.
Substack made sense for several other reasons too. The readership potential is significant. I get about 70,000 views every 30 days on Slow AI. That’s more readers than my 100+ peer-reviewed articles and multiple books will ever reach combined. The platform also allows me to build direct relationships with readers in a way traditional publishing never could. I reply to every comment. People respond to that level of engagement.
The visibility strategy was straightforward: consistent publishing (2-3 posts weekly), active presence on Substack Notes, strategic use of LinkedIn, and leveraging Recommendations from other Substackers. I’m honestly terrible at marketing in the traditional sense, but I try to show up consistently and provide genuine value. That seems to be working, though I’m still learning.
“Who is your audience, and what value or offerings do you provide for them?”
My work is for professionals who feel overwhelmed by AI adoption pressures and don’t know where to turn for honest guidance. Teachers who are told they must integrate AI but aren’t sure when that makes sense. Mid-career professionals facing pressure to automate their workflows. Anyone who suspects the “move fast and optimise everything” narrative isn’t the only story worth telling.
The problem I’m trying to address is this: most AI instruction prioritises technical optimisation and extraction. It’s all about throughput (how to generate 100 emails or 50 images in a minute). What’s missing is judgement. Critical discernment. The ability to know when to use AI and when to leave it the hell alone.
That’s what people seem to value. The free content establishes the framework, but the paid tier (the twelve-month Critical AI Literacy curriculum) gives people structured learning, monthly webinars, and a community of practice. I launched the paid tier in January 2026, and the response has been encouraging, though I’m still figuring out how to do this well.
“What were some of the highs and lows of starting over or launching this venture?”
What was harder than expected: the sheer volume of content creation required to build momentum. Writing 2-3 substantial posts a week in the evenings and at weekends, whilst also raising a young family, can be exhausting. I definitely underestimated that.
What was easier: building the audience, at least initially. I launched with zero subscribers in July 2024. By January 2026, we’re at over 6,600 subscribers. That growth validated something important: there’s genuine appetite for critical, thoughtful AI guidance that doesn’t treat you like an idiot or a sceptic who needs converting.
The paid curriculum launch taught me that people will pay for structure and community, not just content. What really seems to drive engagement is the promise of monthly live webinars and peer learning. People are hungry for spaces where they can think together, not just consume alone.
“What advice would you give to someone considering a similar pivot or looking to get started on Substack?”
Substack has been remarkably generative once I stopped overthinking it. My advice: publish consistently, reply to every comment, use Notes strategically but don’t get lost in it, and remember that quality compounds over time.
Don’t try to game the algorithm. There isn’t much of one to game anyway. Just show up, provide value, and trust that the right people will find you. The recommendation system works if your content is actually good. Several of my largest subscriber jumps came from other writers recommending Slow AI to their audiences.
Also: the About page matters more than you think. Make it clear what you’re about and why someone should care. I spent real time on mine, and it seems to help with conversion rates.
“What challenges did you face during your career pivot, and how did you overcome them?”
What nearly made me quit: honestly, just the exhaustion. Maintaining this level of output while doing everything else is relentless. There were moments when I wondered if I was spreading myself too thin, whether the effort was worth it.
What kept me going: the conversations. I reply to every comment on Slow AI, and those exchanges reminded me why this mattered. Teachers writing to say the prompts helped them think more clearly about classroom use. Mid-career professionals thanking me for permission to slow down. Those human connections made the effort worthwhile.
The validation helped too. When people choose to pay for the curriculum, when they show up to webinars, when they engage thoughtfully in the community, that’s evidence you’re solving a real problem for real people.
“What advice would you give to someone considering a similar career move?”
Start before you feel ready. I launched Slow AI with no subscribers and mediocre certainty about what I was doing. But starting created momentum that thinking never would have.
Don’t write for other academics if you’re trying to reach a broader audience. I spent years writing for peer reviewers, and it’s a completely different skill set than writing for actual humans who need practical help. Learn to write conversationally. People can tell the difference between “translated from academese” and “genuinely written for them.”
And this: you don’t need to choose between credibility and accessibility. My professorship gives me a foundation of expertise, but Slow AI gives me reach. They work together. The point is to use your expertise to serve people who wouldn’t normally have access to it. That’s what universities should be doing anyway.
“How has your work evolved into a portfolio career, and how do these different strands now reinforce each other?”
Beyond Substack, I’m pursuing CPD accreditation for the Critical AI Literacy curriculum to enhance its credibility and value in professional development contexts. I have a literary agent who’s been advising me on next steps, and we’re exploring book possibilities.
I’m also building speaking opportunities around the critical AI literacy framework and developing consultancy work. Everything feeds back into everything else. The research informs the Substack. The Substack conversations reveal gaps in the research. The consultancy work surfaces real-world problems that become curriculum content. It’s a genuine portfolio, not just disconnected income streams.
“When you look ahead, what role do you hope this work plays, and how are you thinking about its future growth?”
As I said above, I think Substack functions as the university for AI right now. It’s where people go to learn how to think about these tools, not just how to use them. That matters. Universities have a role, but they’re not always accessible to everyone who needs this knowledge immediately.
What’s next: continuing to build Slow AI into a resource people trust. The CPD accreditation will hopefully open institutional markets. And I’m thinking carefully about how to scale this work without losing the personal touch that makes it valuable.
Longer term, I want Slow AI to become the default resource people turn to when they need to think critically about AI adoption. Not the loudest voice in the conversation, but hopefully one of the more trustworthy ones. That’s the work.
Links & Resources
You can find Sam’s writing, products, and ongoing work at the links below…
Website: https://www.samillingworth.com/
LinkedIn: https://www.linkedin.com/in/sam-illingworth/
Gumroad: https://samillingworth.gumroad.com/
Related Articles
The Slow AI Curriculum for AI Literacy
AI Slop (collaborative piece with Charlie Hills)
Read More
🧩Behind the Pivot
Learn how to turn your current skills and experience into income and build a portfolio career beyond a single job, so you’re more resilient and in control.
✨The Career Pivot Archive
Real-world career pivots, portfolio paths, and practical lessons from some of your favourite Substackers you can apply to your own next move.
I’m Katharine — a future-focused career strategist helping professionals build income options and stay relevant as work evolves.
🙏 I appreciate you being here and supporting this growing archive… and thank you to all the creators who contribute; it’s such a pleasure to learn from them.
❤️ Loved it? Restack 🔁 and share ✅
🤔Is there a question you wish had been asked?
Drop it in the comments, and let’s keep the conversation going.