Hello friend.
I’ve been around the sun a couple of times now, and I thought I’d pretty much seen it all.
But I never thought I’d see the day when:
· A cat and a raccoon would have a lightsaber death match
· Martin Luther King Jr would be promoting a Jet2Holiday
· Freddie Mercury would steal a KFC bargain bucket
· Albert Einstein would become a UFC champion
But Tuesday 30th September 2025 changed all that.
OpenAI, the world’s most famous AI company, released Sora 2, and it has sent shockwaves around the world.
This is ChatGPT on steroids.
What is Sora2?
Sora 2 is an AI video and audio platform that can generate lifelike, photorealistic videos from simple text prompts.
These videos are complete with motion, sound and realistic physics and require zero knowledge from the user. Download the app, and you are good to go. The only limit is your imagination.
Just as ChatGPT allowed us to manipulate text at will, Sora 2 lets students create, remix, and manipulate “reality” as easily as they edit an image or write an essay.
In recent years, we have had AI video generation models, but they were poor: janky, awkward, and with visuals like a PlayStation 1 game.
But Sora 2 has completely changed the field. These HD, fully realised clips are already flooding the internet, and if it weren’t for how absurd the videos are, it would be difficult to tell they are AI-generated.
Friend, this scares the living daylights out of me.
We have now officially entered a world where we can no longer trust what we see. The implications for society, truth, and well-being are enormous.
We older folk know that MLK Jr. and Gandhi never made a rap album together, but for our younger, more vulnerable students, the relationship between facts and truth will be more strained than ever.
This is another body blow to the school system, and most educators are not equipped to deal with this disruption because:
- Training rarely covers AI literacy or media ethics.
- School policies lag years behind rapid technology rollouts.
- Few teachers have frameworks to address the philosophical questions that AI raises.
- There’s little clarity about how to use generative video safely in classrooms.
At the time of writing, Sora 2 is only available in the US. Still, media-savvy kids with a Virtual Private Network (VPN) and a little imagination have found workarounds to create these videos.
AI safety in schools is not just the Department of Education's job. Every single person who works with children needs to be aware of and alert to the clear and present dangers this technology poses.
Here are five things we all need to consider.
#1 Reality is Now Editable
Sora 2 blurs the lines between “real” and “synthetic” media.
The days when we could watch something online and take it as fact are officially over. We are now going through a massive paradigm shift in how knowledge and evidence are created and verified.
It’s fun and games with cat and raccoon videos, but in our contentious political landscapes, a deepfake video could have horrifying repercussions for our communities and even our international security.
Students will soon live in an environment where any video can be generated, altered, or weaponised.
What this means for teachers:
Media literacy must evolve from identifying “fake news” to spotting synthetic realities.
We must now teach our students how generative models work, what makes something real and fake, and how context determines truth.
Yes, MLK Jr spoke at the Lincoln Memorial, but he never promoted a low-budget airline.
Ironically, Sora 2 has made non-digital sources of information, such as textbooks and pre-internet first-person accounts, more valuable than ever.
#2 Digital Consent is Now a Thing
In the same way we talk about physical and intimate consent, we must now speak about digital consent.
Sora 2 can reproduce human likenesses with startling accuracy, which includes us.
All Sora 2 needs is a one-minute video of you, and poof! It can create a digital replica of you.
This raises new questions about consent, ownership, and personal boundaries — especially for minors.
A student might think it’s funny to make a Sora 2 video of a classmate in an embarrassing situation. But what they see as a joke is actually a new form of bullying.
What this means for teachers:
This is way above our pay grade, but we, as individuals, now have to think about copyrighting our voice, image, and likeness the same way we copyright a song or a brand.
Just as I can’t re-record Jay Z’s “99 Problems” and promote it as my own, other people can’t take my likeness and use it for their own purposes.
But we can’t rely on the Government to help us – they move at the speed of a geriatric turtle.
This is where great school leadership comes in.
Schools need clear policies on student likeness, parental consent for generative media, and disciplinary measures for synthetic impersonation.
Friend, if you are a school leader, you must bring this up with the leadership team, governors and Local Authority/steering committee pronto.
Yes, you will make mistakes and the road will be bumpy, but if you get ahead of this, down the line, everyone will thank you for it.
This is not the time to bury our heads in the sand.
#3 Traditional Creativity Will Be Challenged
Generative video democratises storytelling.
For example, in the old days, if I wanted to make a film, I would have to go to film school, get backing from studios, source the equipment and find a bag of talented actors. We’re not even talking about film and distribution rights.
Now, a 16-year-old student with an idea in his bedroom can create a brilliant cinematic movie with no camera, actors, or editing software.
What does that mean for the entire film industry? Does the young person have the right to call themselves creative? Do we appreciate the art for the result or the effort?
What about everything we do, from music to art? Is it all worthless now?
These are huge existential questions that now sit in the palms of our hands.
What this means for teachers:
I don’t have all the answers, but in my very humble opinion, we need to teach our students to shift their focus from product to process.
We need to teach our students to judge creativity not by output, but by originality, intention, and what they have learned from the mistakes they made.
But we shouldn't completely discard AI. Schools should teach prompt design, narrative planning, and responsible authorship.
We shouldn’t teach our kids to see AI as the enemy but as a creative partner. And we shouldn’t penalise them for using it.
#4 Trust in Everything Will Be at an All-time Low
This platform is barely a month old, and it has gigantic implications.
This technology is likely to spark an explosion of conspiracy theories, distrust, and division within our communities.
Fake news will skyrocket.
Also, for many students entering the workforce, especially in creative fields, this may heighten anxiety about job stability.
Students will express increased apathy and frustration about the purpose of traditional education.
This ‘AI doom’ will make our kids feel more than ever that school is pointless.
If everything can look real, trust becomes the new scarcity. Sora 2 deepens the problem of “truth decay” — when individuals disengage from verified information because they believe nothing is real anymore.
What this means for teachers:
As teachers, we must become a source of truth in a world of confusing information. We must show students how to check facts, compare different sources, and understand how false information spreads.
Encourage them to ask questions, double-check (even triple-check) what they see, and be comfortable saying “I’m not sure.”
#5 Emotional Resilience Matters More Than Ever
The latest research indicates that social media 2.0 has done a number on our kids.
Apps like Sora 2 will only accelerate that trend.
The psychological impact of hyper-realistic AI media is grave. Constant exposure to false digital realities can disrupt our young people’s identities, fuel anxiety, and destroy empathy and trust.
Young people are more digitally interconnected than ever, yet this fake reality distorts what it means to be human and strips away what humans really need: truth, understanding and gathering in meatspace, aka real life.
What this means for teachers:
We need to tell our students to put down their screens and go touch some grass.
Not to sound like a hippy, but we need to teach our students to be more in tune with their physical bodies and interact with their local communities.
We need to make face-to-face conversations (without phones) fashionable again. Skills like empathy, active listening and conflict resolution will be absolutely golden in this AI world.
We can foster this in our classrooms by being inclusive, practising emotional first aid and being bloody decent human beings.
In Conclusion:
Platforms like Sora 2 are EXACTLY the reason why I started the Teach Outside the Robot newsletter: I knew that by the time you got this information through the traditional channels, it might be too late.
‘Forewarned is forearmed’ and all that.
While we can’t stop OpenAI or change governmental policy around disruptive technologies, we still have a massive role to play.
Your ability to lead, empathise and teach your students is needed now more than ever. Your classroom can become a safe harbour in these stormy times, helping your students feel they can move forward and thrive despite external challenges.
If you can do that, you’ve done more than enough.
If you are still unsure what to do, I discuss AI, technology, and leadership in my second book, ‘The Action Hero Teacher 2: Teachers of the Lost Class.’
Three years ago, I saw where this ship was heading, and I outlined what effective classroom leadership would look like in the AI age. You will learn a mix of human psychology, stagecraft and leadership strategies that will help navigate these tricky times.
If you like these newsletters, you’ll love the book. 😎
Have a gander when you are ready.
That’s all for today.
The next TOTR edition comes out on Thursday 13th November 2025.
Until then, take care.
Karl