SPOILER alert: This interview contains spoilers from “Chapter 7: Retreat,” the finale of “A Murder at the End of the World,” now streaming on Hulu.
Like an Agatha Christie novel rebooted for the age of ChatGPT, at the end of “A Murder at the End of the World,” the AI butler did it.
The revelation that Ray (Edoardo Ballerini), the Siri-style digital assistant to billionaire Andy (Clive Owen), has been masterminding the deaths at their snowbound retreat was only one of the startling conclusions of FX’s series, which concluded Dec. 19. (All episodes are now available to stream on Hulu.) Darby Hart (Emma Corrin) came to understand the origin of the killings, carried out by Andy’s young son Zoomer (Kellan Tetlow) as part of what he naively believed was a game with his digital companion. The realization was spurred in part by seeing that her late ex Bill (Harris Dickinson) had circled the phrase “faulty programming” in her book as he died; programming doesn’t come much faultier than Ray, designed to help the guests but impelled to misguidedly protect his creator from perceived intruders.
In a conversation in October for a PvNew profile, “A Murder at the End of the World” creators Brit Marling and Zal Batmanglij ran through the resolutions of the show’s twin crimes, the serial-killer plotline in flashback and the AI-enabled spree in the present day, both of which involved protagonist Darby. They also spoke about Corrin’s casting, as well as the ways in which early access to ChatGPT helped make their story feel true to life.
I want to talk about the circularity of the journey that Darby has been on — both in flashback and then in the present, where she solves the mysteries in both timelines. What were you trying to achieve by delaying the reveal that she did solve the serial killer case in the past until deep into the series?
Brit Marling: It’s funny, because you used the word “circular,” and we built a circular hotel on purpose. We’re trying to think of time in a circular way on purpose. Thinking about Darby wandering the corridors of that circular hotel as wandering the hands of the clock of capital-T Time, the ultimate mystery!
There was something about carving those handoffs in a way that felt more true to us about how time actually functions, which is not nearly so linear a thing. Darby is in the middle of investigating something in the cold and snow of Iceland. And she feels defeated. And that defeat sends her back to a moment in the past when she felt similar defeat, but then she persevered through something and felt a sense of triumph. We talk a lot about the past informing the present, but we don’t talk about how the present animates the past. When you re-remember something, you’re reshaping time. We wanted to create that feeling.
And it felt like if we were doing our job correctly by braiding these narratives, you would see that the serial killer in the past is also connected to the code that animates Ray. When we build AI that is coming from the data set that has preceded it, it’s taking in and ingesting that sociopathic behavior, that misogynistic behavior, the racism, the homophobia, and it’s building code that reanimates those ideas. If we did it right, you’d see that Darby catching the serial killer in the past is very connected to catching the threads of the same behavior that’s animating all of us now through the algorithms that shape our lives.
Zal Batmanglij: Her solving the case is so connected to her losing love, or turning her back on love. She chose her work over her love.
That’s why her reunion with Bill early in the series is so charged — because she was so deep in the case that he walked away.
Batmanglij: She’s so in it, and she didn’t realize it. When the story starts, six years later, she doesn’t know that she did that. She has to go through the stress, to revisit that and see it anew. It’s one of the reasons I’m drawn to mysteries: having the bravery to revisit your insights — your own dark chambers that you don’t want to look at. When Darby does that, she’s freed to be able to see things in a different way, and we can have our ending.
Ray, the AI servant, is infused with all of the worst of us — and, I think, a lot of Andy, the Clive Owen character. But is there a world where he could have been benevolent, a helper who brought us further along in our understanding of the world?
Marling: I think about this so much. So much of this depends on who’s programming it, and what the intention is. We’re locked in this system where, unfortunately, all anybody can really think about is how to make profit the next quarter, and how to beat the competition to market.
I was reading the Financial Times, an interview with Ted Chiang. And I was so moved by how he put this, because I think he really understands AI. He was like, “We’re already in a state of late-stage capitalism — it’s very troubling and detrimental to the environment and all of our health. Add AI to that, and it becomes a force multiplier for everything that’s not working.”
If you think about AI in a vacuum — a science-fiction vacuum — there are so many dazzling ways in which it could come to life. But I don’t know that we’re yet seeing that, because so much of what we’re given permission to imagine or the financing to achieve is only in terms of what would be profitable.
Batmanglij: I was with an AI designer recently, and he was saying that, at their company, a major tech company, they would always have safety guards — they would have testing, have legal look at things. But the arms race for AI is going so fast that they’re just releasing things without testing. It’s stuff that is so terrifying.
Marling: Because it makes us the test subjects. We are the lab rats.
I found the Andy character very sad, because he seems to have every misguided impulse about how to use technology. Every impulse he has pushes him to use technology to isolate himself from humanity. It’s so sad to look at a character who’s a genius and think, all this intelligence went towards building a fortress and a robot butler who murders people.
Marling: Imagine if all of that intelligence had gone in service of humanity’s collective goals — what you could achieve. You can tell that with Andy, all of the early trauma in his life put him in a state of such insecurity and fear that the channeling of his intelligence became about buttressing himself against his anxieties about the world, in a very selfish, self-centered way. The question is: Can all of us collectively lift ourselves out of our self-centeredness and try to imagine the thrust of our lives and our considerable amassed intelligence going toward a more collective aim of the health of the planet?
Batmanglij: Or even the health of his family. What’s so sad is that his corrosive elements corrode even the thing that’s most valuable.
I thought the performance that Edoardo Ballerini gave as Ray was pretty incredible. I really believed him as an answer to Alexa or Siri. How did you find him, and how did you write for him?
Marling: Zal came in one day in the writing phase and was like, “I know who the AI is.” And he played me a tape of a New Yorker profile that Edoardo had read aloud. I saw what he was seeing: Edoardo has this incredible diction that feels real and unreal. The moment we met him, we did some tests, and this is a testament to Edoardo’s talents as an actor — he studied all these different video games to figure out ways to move in ways that are just a little bit off. And yet the humanity is there and present; you feel it in the eyes.
Batmanglij: 2019 is when we came up with this idea, and 2020 was when we were writing it, and Edoardo would keep me company, because I would listen to all these audiobooks or articles he would read aloud. We didn’t want to present technology as this blanket evil, black or white. Edoardo is who I would want as my—
Marling: —AI assistant!
Batmanglij: There’s something about him that is…
Marling: It’s an almost perfection of diction, of articulation. It has the uncanny feeling of being not quite human. It’s so good. How can it be so good?
Batmanglij: And then, Brit and I had an invitation to each other that we don’t make the technology fake. The AI isn’t evil. It’s not sentient. It’s not even aware. We, the audience, are anthropomorphizing and giving meaning, but it’s machines. Our friend Moxie [Marlinspike], the amazing tech guru who invented [encrypted messaging system] Signal, is a buddy of ours from the olden days. He got us access to [Chat]GPT three years ago. We would ask it questions. And we did that initial conversation between Ray and Darby with it. We would give it parameters, and it would make little mistakes. And we put those in there. Because Edoardo is performing it, you think it’s a real conversation, but it’s actually an AI conversation from years ago.
I want to ask you about casting Emma Corrin. I cannot imagine a role less like Princess Diana — which is the only other thing I’ve seen them in so far — than this one.
Marling: For some reason, in the pandemic, Darby sprung up. Darby, as a character, was a way of talking about a lot of what was happening in that time and the things that we were feeling. Darby, from the beginning, existed so robustly between us, but was very hard to capture on the page and communicate to other people. We would take things apart and start from scratch — and then we cracked it. And a very strange thing happened. We’ve never had this before with our scripts. People wanted to read the whole thing. Everybody emerged from reading being like, “Darby is iconic.”
And when we met Emma for the first time, over Zoom, Emma just had this energy and this self-possession and this clear-eyed gaze that just felt Darby. On set, it was so challenging, physically, emotionally, psychologically, spiritually, to be in every scene every day. I only realized the extent to which Emma became Darby when I saw them a couple months after shooting, and we met up at a vintage store to go shopping. When we hugged, everything about it was Emma again. Everything about how Emma carries themself. I was like, “Oh, we’ve spent a year with Darby, with another person.” That’s a kind of acting that’s happening on a molecular level, that’s about cellular transformation.