PROLOGUE: "Cinema Novo Did Not Ask Permission Either" (0:00-2:00)
Sao Paulo. On the road to Cannes.
A festival built around the question every filmmaker in this room is already living: what happens when AI enters the frame?
You know who else asked dangerous questions about cinema? Cinema Novo. Glauber Rocha didn't have a playbook. He had a camera and something to say about Brazil. He didn't build a product. He built a movement.
I'm not here to teach you. I'm here to share what I've learned -- and to learn from you.
But first, I need to tell you something honest.
ACT I: THE LEFT HAND -- Naming What Is True (2:00-13:00)
The Non-Consensual Opening (2:00-4:00)
I find my relationship with AI completely non-consensual.
LaSalle College. January 2026. 200 art and design students. I said those words and the room went quiet.
It didn't ask before stealing my work. It didn't ask before taking over my field. It didn't ask if it could change everything around me. All I'm trying to do is keep my head above water and reinvent myself.
I released thousands of photos on Flickr over two decades under Creative Commons. Shared them freely -- that was the deal. When I checked a dataset search tool, I found roughly 1,800 of my images in a major training corpus.
Nobody asked me. They just took it.
If you've ever released a film, posted a photograph, written a script, published your work online -- it may already be in training data. You weren't asked. That's not hypothetical. That's what happened. And everyone in this room knows it.
Three Fears That Are Real (4:00-6:30)
Not hypothetical fears. Real shit happening right now.
One: Stolen work without consent.
Training data equals humanity's creative output scraped without permission. Your photos. Your writing. Your scenes. Used without asking. No compensation. No attribution. Call it what it is.
Two: Race to the bottom.
"Good enough" AI outputs devalue human craft. Clients are already saying it out loud: "Can't you just use AI for this?" Translation: "Why should I pay you when a tool can approximate it?" That pressure is messing with people's livelihoods.
Three: Junior pipeline collapse.
Entry-level jobs are vanishing. The roles that trained the next generation are disappearing. How do people become directors if they can't be assistants first? How do cinematographers learn their craft if nobody hires camera assistants anymore? Every experienced filmmaker in this room benefited from a pipeline that's collapsing under their feet.
I teach AI for a living. I'm going to be honest with you: I don't have the junior pipeline problem solved. Nobody does.
Name What You're Seeing (6:30-8:00) -- The Framing Problem
And there's a fourth thing we need to talk about. What they built with our stolen work.
I wrote something about this recently that I keep coming back to. We've said the word "bias" so many times it's lost meaning. Algorithmic bias. Training data bias. Gender bias. Generic. Safe. Abstract.
Stop saying bias. Name what you're seeing.
When a colleague of mine asked an image generator to create a "marketing professor," it gave her a man in a power pose -- Harvard blazer, authority, gravitas. Then she asked it to make the professor female. She got a timid school teacher. Not the same authority. Not the same context. A diminished version.
It took her multiple rounds of prompting to get a woman with the same authority the system gave a man by default.
That's not bias. That's misogyny embedded in image generation systems. Name it.
When facial recognition fails on darker-skinned women at 40 times the rate it fails on lighter-skinned men -- 34.7% error versus 0.8% -- that's not a "performance gap." That's racism and sexism embedded in code. Dr. Joy Buolamwini proved it. She calls it the coded gaze: AI systems reflect the gaze of their creators.
You're filmmakers. You understand this instinctively. What's missing from the frame matters as much as what's in it. Every shot is a decision about who's included and who's erased. Training data works exactly the same way. What data gets collected? Who's represented? Who's missing? Whose experience becomes the baseline?
Those aren't technical decisions. They're values decisions.
I call it bias laundering -- discrimination that looks like math. "The algorithm said so" carries weight that "my gut feeling says so" doesn't. But when you train that algorithm on centuries of discrimination, you're not creating objectivity. You're automating inequity.
One more thing to carry in the left hand. Don't drop it.
The 55% (8:00-9:30)
Here's a number that changed how I think about all of this.
Gallup data: 55% of US workers have never used AI at all. Only 12% use it daily. Half the workforce is still on the sidelines.
Think about what that means. If all the thoughtful, ethical, critical people opt out of AI -- if they decide it's too problematic to touch -- governance gets made by opportunists and true believers. The people who understand the risks best aren't in the room when the decisions get made.
That's worse. Worse than engaging with your critique intact.
The First Protester (9:30-11:00)
I had my first protester at a keynote recently. Back row. Video camera rolling. Took the mic during Q&A. Wouldn't give it back. Made YouTube soundbites.
And here's the thing -- she's right to be concerned.
If you're not 20 years old, pissed off, and challenging things, you're not really paying attention to what's going on right now.
Be angry. That anger is legitimate. Channel it.
Just don't let anger be the only thing you hold.
Holding the Left Hand (11:00-13:00)
This is what I carry. Stolen work. Devalued craft. Collapsing pipelines. Discrimination laundered through math. A relationship I never consented to.
Don't drop these. They keep you honest.
Now let me show you what's in my right hand.
ACT II: THE RIGHT HAND -- What Is Also True (13:00-29:00)
What's also true. Not hype. Not corporate optimism. Actual transformations I've witnessed with my own eyes.
Luke Minaker: The 80/20 Flip (14:00-18:00) -- Centerpiece Story
You're filmmakers. You know three-act structure. Let me tell you a story in three acts.
Act One.
A kid named Luke. Twelve years old. Sitting in a theater watching Jurassic Park for the first time. He sees those velociraptors come alive and something clicks. He turns to his mom and says: "This is what I want to do. I want to bring monsters to life."
Act Two.
Thirty years later. Luke is an animator. But he's not bringing monsters to life. He's making cheap plastic gags to sell cheap plastic toys. Barbie commercials. Corporate animation. Cog in the machine.
The dream from that movie theater? Buried under 30 years of grind.
The ratio: 80% grind, 20% creative decisions. That's what the industry gave him.
Act Three.
AI tools arrive. And the ratio flips.
80% creative. 20% grind.
A year's worth of traditional production -- completed in two to three weeks.
Luke's studio, Tiny Ghost, started two years ago with zero revenue. By January 2026, they were fielding calls from household names. Their original IP "Blood and Glitter" just got greenlit and funded -- by their account, the strongest AI animation IP currently in production.
But here's the line that got me. Luke said: "I don't care about the money. I am so excited that I get to do art again."
The 12-year-old who wanted to bring monsters to life? He's back. After 30 years.
That's not replacement. That's liberation.
Kevin Friel: The Conductor (18:00-21:00)
Kevin Friel. Twenty-five years in Hollywood visual effects. Dune. Detective Pikachu. The guy has worked on things you've seen on the biggest screens in the world.
Kevin told our Film Club something that reframed everything for me. He said: "My mum took me to the National Arts Center in Ottawa starting at age six to see the symphonic orchestra play."
And then he made the connection.
"The conductor is still important hundreds of years into classical music. It's the role of knowing enough about all of the parts, but bringing out the best of the sum of the parts."
One agent writes the script. Another generates scenes. A third scores the music. A fourth handles continuity. A conductor orchestrates them all. That's the orchestration era. It's already happening.
Kevin's advice: "Get it 85% of the way there with AI, then use craft to finish."
That last 15% -- the human judgment, the aesthetic choices, the "this scene needs to breathe here" decisions -- that's where the art lives.
Taste and judgment built over 25 years is what makes output art, not noise. The tools don't replace the conductor. They give the conductor a bigger orchestra.
Belgium: "AI Is My Hope to Continue to Exist" (21:00-24:00) -- The Gut-Punch
At our second Film Club meeting, an actress who goes by Belgium stood up. She'd been bedridden for 18 months. Couldn't work. Couldn't perform. Couldn't do the thing that made her who she is.
During those months, she started making AI avatars. Creating characters. Building stories from her bed.
She said: "AI film is my hope to be able to continue to exist and to tell something. I feel like I start from scratch in some ways. And AI is going to be my hope to be able to continue to exist."
For some people, AI tools are not about getting ahead. Not about efficiency. Not about flipping ratios.
They're about getting to participate at all.
Don't forget that. When the debates get abstract and ideological, remember Belgium. Remember what access actually means to someone who almost lost their voice entirely.
The Transformation Pattern (24:00-26:00) -- Quick Montage
This pattern repeats across every cohort I teach.
Larissa. Book editor. Showed up with what she called "an almost hostile attitude toward AI." By week three, she was building tools she'd never imagined.
Louise. Writer. Laid off from Brown University. Reinventing herself. Described AI as "a colleague who has their problems but actually comes up with great ideas."
Sam. Resistant at first. By the end, she'd cut her production time AND raised her prices. First person in four cohorts to do both.
Kara. One year out of college. Her school enforced a ban on AI in creative work. She showed up and said: "I want to learn as much as I can."
Fear. Discovery. Integration. Identity shift. The pattern repeats.
The Both/And Statement (26:00-27:30) -- Emotional Pivot
My personal testimony:
AI is trained on stolen work without consent. And I'm more creative, more productive, and more powerful than I've ever been in my whole life.
Both statements are true.
I can critique the theft AND use the capability. I can resist extraction AND build with the tools. I can be pissed about non-consent AND leverage the moment.
Not contradiction. Complexity.
Holding the Right Hand (27:30-29:00)
Don't drop this either. Don't drop it for ideological purity. Don't drop it because the critique is easier to hold.
The transformation is real. People's lives are changing. Luke got his art back. Kevin found a new way to conduct. Belgium found a way to exist.
The capability explosion is happening. You can pretend it's not. Or you can engage with it critically.
ACT III: THE THIRD PATH -- Walking Forward (29:00-39:00)
The Binary Trap (29:00-31:00)
Most conversations about AI force you to pick a side.
The Boosters: "Embrace it fully! Stop being afraid! You're a luddite if you don't!"
The Doomers: "Resist completely! Don't participate! You're complicit if you do!"
Both feel clean. Both are incomplete.
You're filmmakers. You know the binary is lazy writing. Two-dimensional characters don't move anyone. The interesting choice is always the third option -- the one that's harder to hold.
Proof Points (31:00-33:30)
Let me show you what the curiosity hand actually builds.
Vibe Coding.
There's a movement called vibe coding -- building technology by describing what you want in plain language. You say "build me an interactive timeline showing these events" and the AI handles the code.
When I taught this through the Google News Initiative AI Lab, a journalist named Julie built a fully interactive landing page in about an hour. No coding experience. Zero.
Here's what's wild: in my cohorts, creative professionals consistently outperform developers at vibe coding. Because the bottleneck shifted. It's not about syntax anymore. It's about knowing what humans need. Understanding user experience. Having taste.
That's the art school dream recaptured. Your creative training isn't obsolete -- it's suddenly the most valuable skill in the room.
Knowledge Bases.
At its core, every AI assistant runs on two things: a system prompt and a knowledge base. That's it. The system prompt tells it how to behave. The knowledge base tells it what to know.
Build a knowledge base that contains your aesthetic, your references, your voice, your process -- and you have an AI collaborator that actually understands your work. Not a generic tool. Yours.
That's sovereignty over your own data. Not renting someone else's intelligence -- building your own.
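For anyone who wants to see the shape of this, here is a minimal sketch of the system-prompt-plus-knowledge-base pattern. Everything in it is hypothetical: the function name, the persona text, and the folder-of-text-files layout are illustration, not any vendor's API. The assembled prompt would be handed to whatever chat model you use.

```python
from pathlib import Path

def build_system_prompt(persona: str, kb_dir: str) -> str:
    """Compose a system prompt from a persona statement plus a
    knowledge base: a folder of plain-text notes on your aesthetic,
    references, voice, and process (hypothetical layout)."""
    notes = []
    # Sort filenames so the assembled prompt is stable across runs.
    for path in sorted(Path(kb_dir).glob("*.txt")):
        notes.append(f"## {path.stem}\n{path.read_text().strip()}")
    return persona.strip() + "\n\nKnowledge base:\n\n" + "\n\n".join(notes)
```

The point is the shape, not the code: the persona tells the assistant how to behave, the notes tell it what to know, and both live in plain files you own, version, and carry between tools.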
The Brazilian Opportunity (33:30-36:00)
Cinema Novo was ecosystem building before anyone called it that. Glauber Rocha didn't just make films. He built the conditions for a generation of cinema. Critics, musicians, writers, filmmakers -- all creating a cultural shift together. Not a product. A movement.
You're part of that tradition.
AI tools lower barriers. International platforms want Brazilian stories. The infrastructure is here -- film schools, festivals, Rede Globo, independent studios. And now the tools of production are radically democratized.
But here's the thing. Parasite worked because Bong Joon-ho made a deeply Korean film. He didn't try to be universal by removing Korean specificity. The specificity WAS the universality.
Kleber Mendonca Filho builds from Recife, not Hollywood. Anna Muylaert tells Sao Paulo stories. Karim Ainouz combines Ceara roots with international reach.
The more specific, the more universal. Don't copy Hollywood. Build from Brazilian culture.
Your perspective is training data no model contains.
The Soul Dimension (36:00-39:00) -- Bass Coast Energy Enters
The real question is not "How do I use AI tools?"
The real question is: What do you refuse to outsource?
When AI can handle the calendaring, the accounting, the rough cuts, the continuity notes, the color grading suggestions, the temp scores -- what remains?
Here's my list. The things AI can't eat:
Laughing until you can't breathe. Crying at a film that sees you. Kissing someone for the first time. Getting lost in the woods and not caring. Dancing until your atoms rearrange.
These are not features. These are not outputs. These are the reasons we make films in the first place.
AI is a mirror. It shows us who we've been -- our patterns, our discrimination, our historical inequities -- and amplifies them at planetary scale. If you bring vision and cultural depth, the mirror gives you cinema. If you bring nothing, it gives you content. If nobody checks whose face the system fails to see, the mirror just automates the erasure.
The mirror doesn't lie. But it only shows what we put in front of it.
Feed your soul, not the machine.
CLOSING: "You Coming?" (39:00-44:00)
The Choice (39:00-40:30)
Three options.
One: Embrace uncritically. Become a booster. Ignore the harms. Participate in extraction. Feel clean about your enthusiasm. Look back in ten years and realize you were complicit in something harmful because you refused to look at the critique.
Two: Resist completely. Become a doomer. Maintain purity. Lose influence. Feel clean about your principles. Look back in ten years and realize you had no voice in how things unfolded because you opted out of the conversation.
Three: Walk forward with both hands full.
The Stakes (40:30-42:00)
You are 600 people at the first edition of a festival bridging Brazil, France, and Canada. You are not attendees. You are founding members. What you build here shapes AI filmmaking for your generation.
Not through manifestos. Through the work you make. The communities you build. The ecosystems you tend.
That's not small. That's Cinema Novo energy. That's building a movement, not attending a conference.
The Invocation (42:00-44:00)
Make things. Not one perfect film -- many experiments.
Find five people. Start a group chat. Watch each other's work. Give real feedback. Not polite feedback. Real feedback.
Hold critique in one hand. Hold curiosity in the other. Keep walking.
Feed your soul, not the machine. Stay specific. Stay Brazilian. Stay human.
I'm here. Both hands full. Walking forward.
You coming?
Obrigado.
Q&A PREP
"How do you reconcile using AI when it's trained on stolen work?"
I don't reconcile it. I hold the tension. I'm not okay with the theft. I also recognize that opting out completely means I have no voice in shaping how AI develops. So I engage critically -- I've trained my own model on 2,000 of my own cross-processed portraits, presented at Ars Electronica. I teach consent frameworks. I advocate for artist-owned training data. But I also use the tools that exist now because waiting for perfect ethical AI means waiting forever and losing all influence in the meantime.
"Isn't Both Hands Full just fence-sitting?"
No. Fence-sitting is refusing to take a position. Both Hands Full is taking two positions simultaneously. I have very clear positions: theft is wrong. Transformation is real. I'm not neutral. I'm dual. That's not weakness -- it's the sophistication filmmakers already understand. Every great film holds contradictions. Every great character contains multitudes.
"What about the junior pipeline? You said you don't have it solved."
I don't. And I won't pretend otherwise. What I can say: the mentorship that used to happen through assistant roles needs to find new containers. Community-based learning. Cohort programs. Filmmaker collectives where experienced people bring up the next generation. The mechanism changes but the need doesn't go away. If you run a studio, build those containers. If you're early in your career, invest in relationships with experienced filmmakers. Those relationships will matter more than any job title.
"What specific AI tools should filmmakers learn?"
The tools change every six months. What doesn't change: understanding workflow integration. How does this serve the story? Does this free time for creative thinking? Current useful categories: script development (ChatGPT, Claude), video generation (Runway, Kling), audio and music (Suno, ElevenLabs), editing assistants. But learn the principles, not the product names. The products will be different in six months.
"Does using AI make my work less authentic?"
Using a camera doesn't make photography less authentic. Using editing software doesn't make your film less real. Tools are tools. Your vision, your choices, your intention make it authentic. The question is: are you using tools with intention, or are you letting them dictate your aesthetic? If you bring vision, you get cinema. If you bring nothing, you get content.
"What about AI bias and representation in film?"
It's not bias. Name what you're seeing. When an image generator gives a male professor authority and a female professor timidity by default, that's misogyny. When facial recognition fails on darker-skinned women at 40 times the rate of lighter-skinned men, that's racism embedded in code. Dr. Joy Buolamwini proved it with the Gender Shades study. I call it bias laundering -- discrimination that looks like math. As filmmakers, you understand framing. What's missing from the frame becomes what's missing from the system. Same principle, planetary scale.
Backup: "What about environmental impact?"
Real concern. Training large models burns significant energy. Worth advocating for transparency from AI companies on carbon footprint. Worth supporting research into more efficient models. One more thing to carry in the left hand -- don't drop it. But also don't let it become the reason you opt out of shaping how this technology develops.
TIMING NOTES
| Section | Duration | Cumulative |
|---|---|---|
| Prologue | 2:00 | 2:00 |
| Non-Consensual Opening | 2:00 | 4:00 |
| Three Fears | 2:30 | 6:30 |
| Name What You're Seeing | 1:30 | 8:00 |
| The 55% | 1:30 | 9:30 |
| First Protester | 1:30 | 11:00 |
| Holding Left Hand | 2:00 | 13:00 |
| Act II Intro | 1:00 | 14:00 |
| Luke / 80/20 Flip | 4:00 | 18:00 |
| Kevin / Conductor | 3:00 | 21:00 |
| Belgium | 3:00 | 24:00 |
| Transformation Montage | 2:00 | 26:00 |
| Both/And Statement | 1:30 | 27:30 |
| Holding Right Hand | 1:30 | 29:00 |
| Binary Trap | 2:00 | 31:00 |
| Proof Points (Vibe Coding + KB) | 2:30 | 33:30 |
| Brazilian Opportunity | 2:30 | 36:00 |
| Soul Dimension | 3:00 | 39:00 |
| The Choice | 1:30 | 40:30 |
| The Stakes | 1:30 | 42:00 |
| The Invocation | 2:00 | 44:00 |
Total: ~44 minutes. Tight but doable. If running long, trim transformation montage to one sentence.
STAGE DIRECTION SUMMARY
- Prologue: Center stage. No slides. Warm, grounded.
- Act I: Stage left. Left hand physically raised at key moments. Heavy energy. Slower pacing.
- Act II: Stage right. Right hand raised. Energy lifts. Stories carry the weight.
- Both/And pivot: Center stage. Both hands up. The emotional core.
- Act III: Center stage, moving freely. Building energy. Brazilian pride section gets warmer.
- Soul Dimension: Stillness. Lower register. Intimacy in a room of 600.
- Closing: Center stage. Performed. Benediction energy. Hold silence at end.
Word Count: ~3,800 words
Speaking Pace: ~86 words/minute (slower than normal for weight and translation pauses)
Version: 1.1
Created: 2026-02-25
Updated: 2026-02-25
Status: Draft for review -- now includes "Name the Bias" material
