10/23/2025 | News release
Dream utilizes Random Actor, software developed by two CFA faculty members that projects immersive images onto a live performance.
The works of William Shakespeare remain as popular today as when he wrote them in the late 16th century. Their timeless quality is what made one of them the perfect testing ground for novel software in a new multidisciplinary Boston University College of Fine Arts production.
Dream, a 90-minute iteration of Shakespeare's beloved comedy A Midsummer Night's Dream, utilizes Random Actor, a machine-learning tool for live performances. The software, developed at BU, uses motion capture technology and generative AI to project immersive, interactive images onto the stage, the floor, and even actors' bodies, adding an element of spontaneity to every performance, according to its creators. The production is a collaboration among all three CFA schools: the School of Theatre, the School of Visual Arts, and the School of Music.
Technically speaking, the play, which premieres Friday, October 24, at 7:30 pm, is a CFA faculty research project, says Dream director and Random Actor cocreator Clay Hopper (CFA'05), a CFA senior lecturer.
Hopper and James Grady, a CFA assistant professor of art, graphic design, and BU Spark! creative director, first conceptualized an immersive performance tool around 2017. They developed Random Actor and subsequently deployed it in a couple of CFA productions, including its hard launch during 2021's Exit the King, but according to Hopper, it wasn't used to its full capabilities.
After Exit the King, "we worked with developers to build out the software, and we started to become very surprised by what it could do," Hopper says. (The Duan Family Center for Computing & Data Sciences, the Shipley Center for Digital Learning & Innovation, BU Spark!, and countless BU students were instrumental in bringing Random Actor to life, he says.) This year, he, Grady, and Jon Savage, a CFA assistant professor, scene design, set out to stage a new show with Random Actor to "prove the software's efficacy inside a narrative context," Hopper says, and thus Dream was, well, dreamed up.
Random Actor is operated in real time by a designer; in Dream's case, Jeremy Cronenberg (CFA'29). The visuals, which can be anything from floating flowers to pulsating multicolored circles to autumn leaves twisting in the wind, adjust along with actors' bodies and voices. They're as much a part of the show as the actors themselves. As Grady explains, Random Actor is a "tool that lives at the intersection of code, motion, and human performance." The software, he says, "introduces an element of chance by generating visuals that influence the actors and the audience; it's almost like having another performer on stage, one you can't predict, but have to respond to. The goal is to explore that tension between structure and spontaneity, where technology doesn't control the performance but adds to the sense of discovery."
Hopper and his collaborators chose A Midsummer Night's Dream as a test case for two reasons. First, the play was written centuries before computers were invented; if the software works with the Bard, it can work with anything, they reasoned. Second, the play, with its magical forest setting and fairy hijinks, seemed a perfect vehicle for what Random Actor can do.
Shakespeare's works "already live in that space between order and chaos," Grady says. "A Midsummer Night's Dream is full of transformation and illusion, so it naturally connects with the playful, experimental nature of the software."
Dream also relies on the talents of School of Music personnel. The entire production is scored with improvised acoustic percussion by Gareth Smith, a CFA assistant professor of music, music education.
The sound is yet another interactive element of the play. While Hopper gave Smith sound cues and some specific noise requests, "I have had latitude to play sounds and create textures that are inspiring to me," Smith says. He also plays off Random Actor. The visuals change based on the sounds he produces, so "I get to augment a scene by interacting with a visual at the same time as it morphs, and then the actors respond to me, and I respond to them, and they respond to me responding to Random Actor… It's so cool and such an elevated experience."
And that's the point of this part-play, part-research project, its architects say.
In their eyes, technology has the power to augment the arts-but not replace them. "What we're looking to do is examine how these technologies influence the way we produce art, in the way we consume it, and the way we put it together," Hopper says. "This production is a love letter to classic modes of storytelling, made even more enduring with emerging technology.
"Nothing comes close to the intrinsic value and power that arises from people in a dark room telling a story and listening to it."
Dream premieres Friday, October 24, at 7:30 pm in CFA's "Jewels 1" Juliane Ethel Leilani Miller Studio Theatre (CFA 352), with two additional performances Saturday, at 2 and 7:30 pm, and one performance Sunday at 2 pm. Tickets are free at the door; seating is limited. Find more information about Dream here.
CFA Play Mixes Generative AI and Shakespeare