
Take a peek inside a flickering candle flame with these 3D printed shapes

    New research from MIT explores fire from a host of new perspectives. Using deep learning, the researchers extract the vibrational characteristics of flickering flames and convert them into sounds and physical materials.

    The 19th-century physicist Michael Faraday was known not only for his pioneering experimental contributions to electromagnetism, but also for his public speaking. His annual Christmas lectures at the Royal Institution evolved into a holiday tradition that continues to this day. One of his most famous Christmas lectures was on the chemical history of a candle. Faraday illustrated his points with a simple experiment: he placed a candle in a lamp glass to shut out any breezes and get “a still flame.” Faraday then showed how the flame’s shape flickered and changed in response to disturbances.

    “You must not imagine, because you see these tongues all at once, that the flame is of this particular shape,” Faraday noted. “A flame of that shape is never so at any one time. Never is the body of the flame, like that which you just saw rising from the ball, of the shape it appears to you. It consists of a multitude of different shapes, succeeding one another so fast that the eye is only able to take cognizance of them all at once.”

    Now MIT researchers have brought Faraday’s simple experiment into the 21st century. Markus Buehler and his postdoc, Mario Milazzo, combined high-resolution imaging with deep machine learning to sonify a single candle flame. They then used that one flame as a basic building block, creating ‘music’ from its flickering dynamics and designing new structures that can be 3D printed into physical objects. Buehler described this and other related work at the American Physical Society meeting in Chicago last week.

    The dynamics of a flickering candle flame. Researchers are using deep learning to first examine what the vibration of a single flame sounds like and then generalize the approach to a larger fire that creates a variety of sounds.

    MIT

    As we reported before, Buehler specializes in developing AI models to design new proteins. He is perhaps best known for using sonification to illuminate structural details that would otherwise be elusive. Buehler discovered that the hierarchical elements of music composition (pitch, range, dynamics, tempo) are analogous to the hierarchical elements of protein structure. Just as music draws on a limited number of notes and chords, combined in different ways to compose a piece, proteins have a limited number of building blocks (20 amino acids) that can be combined in a variety of ways to create new protein structures with unique properties. Each amino acid has a specific sound signature, similar to a fingerprint.
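    The sonification idea described above can be sketched as a simple mapping from amino acids to pitches. The mapping below is hypothetical and chosen only for illustration; Buehler's actual sound signatures are derived from molecular vibrations, not from an arbitrary chromatic scale.

```python
# Sketch of protein sonification: assign each of the 20 amino acids a pitch
# and render a sequence as a list of frequencies. The note assignment here is
# hypothetical (a plain chromatic scale), purely to illustrate the idea.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard one-letter codes

# Hypothetical mapping: spread the residues over a chromatic scale from C4.
NOTE_OF = {aa: 60 + i for i, aa in enumerate(AMINO_ACIDS)}  # MIDI note numbers

def midi_to_hz(note: int) -> float:
    """Convert a MIDI note number to frequency (A4 = note 69 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)

def sonify(sequence: str) -> list[float]:
    """Map a protein sequence to a list of pitch frequencies in Hz."""
    return [midi_to_hz(NOTE_OF[aa]) for aa in sequence if aa in NOTE_OF]

# A short (hypothetical) peptide rendered as pitches:
melody = sonify("GAVLM")
```

    Feeding different sequences through such a mapping yields different melodies, which is the intuition behind composing "new" proteins by composing new music.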

    Several years ago, Buehler led a team of MIT scientists who mapped the molecular structure of proteins in spider silk threads to music theory to produce the “sound” of silk. The hope was to find a radically new way to make designer proteins. That work inspired an exhibition of sonification art, “Spider’s Canvas,” in Paris in 2018. Artist Tomas Saraceno collaborated with MIT engineers to create an interactive harp-like instrument inspired by the web of a Cyrtophora citricola spider, where each strand in the “web” is tuned to a different pitch. Combining those notes in different patterns within the web’s 3D fabric generates melodies.

    In 2019, Buehler’s team developed an even more sophisticated system for making music based on a protein structure – and then converting that music back into new proteins that do not occur in nature. The goal was to learn how to make synthetic cobwebs and other structures that mimic the spider’s process. And in 2020, Buehler’s team applied the same approach to model the vibrational properties of the spike protein responsible for the high infection rate of the novel coronavirus (SARS-CoV-2).

    Machine-learning-rendered image of a flame and its 3D-printed fabrication.

    Markus Buehler

    Buehler wondered if this approach could be extended to study fire. “Flames are naturally silent,” he said at a press conference. “However, fire has all the elements of a vibrating string or vibrating molecule, but in a dynamic pattern that is interesting. If we could hear them, what would they sound like? Can we materialize fire? Can we push the boundaries to create bio-inspired materials that you could really feel and touch as a result?”

    Like Faraday centuries ago, Buehler and Milazzo started with a simple experiment on a single candle flame. (A larger fire has so many disturbances that it becomes too difficult to model, but a single flame can be thought of as a fundamental building block of fire.) The researchers lit a candle in a controlled environment with no air movement or other external disturbances: Faraday’s still flame. They then played sounds from a speaker and used a high-speed camera to record how the flame flickered and distorted over time in response to those acoustic signals.
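    One simple way to turn such a high-speed recording into data is to measure, frame by frame, how far the flame deviates from its average shape. The sketch below uses synthetic arrays as stand-ins for camera frames; the deflection metric (mean absolute difference from the average frame) is an assumption for illustration, not the researchers' actual pipeline.

```python
import numpy as np

# Sketch: reduce a stack of video frames to a 1-D "flicker" signal by
# measuring how much each frame deviates from the mean (quiet) flame image.

rng = np.random.default_rng(0)

def flicker_signal(frames: np.ndarray) -> np.ndarray:
    """Per-frame deflection: mean absolute difference from the average frame."""
    baseline = frames.mean(axis=0)
    return np.abs(frames - baseline).mean(axis=(1, 2))

# Synthetic stand-in for real footage: 100 frames of 32x32 "intensity" images
# whose brightness oscillates as if the flame were driven by a pure tone.
t = np.arange(100)
frames = (0.5 + 0.1 * np.sin(2 * np.pi * t / 20)[:, None, None]
          + 0.01 * rng.standard_normal((100, 32, 32)))

signal = flicker_signal(frames)  # one deflection value per frame
```

    A signal like this is what a learning model can then relate back to the sound that was played at the flame.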

    Montage simulation of flames to a princess in a fairy garden.

    Markus Buehler and Mario Milazzo, MIT

    “This creates characteristic shapes, but they are not the same shapes every time,” Buehler said. “This is a dynamic process, so what you see [in our images] is just a snapshot of this. In reality, there are thousands upon thousands of images for every expression of the acoustic signal – a circle of fire.”

    He and Milazzo then trained a neural network to classify the original audio signals that created a particular flame shape. The researchers effectively sonified the vibrational frequencies of fire. The more violently a flame deflects, the more dramatically the audio signal changes. The flame becomes a kind of musical instrument, which we can ‘play’, for example by exposing it to air currents to make the flame flicker in certain ways, a form of musical composition.

    “Fire is vibrating, rhythmic and repetitive and constantly changing, and this is what defines music,” Buehler said. “Deep learning helps us to mine the data and certain patterns of fire, and with different patterns in fire you can create this orchestra of different sounds.”

    Buehler and Milazzo have also used the different shapes of flickering flames as building blocks to design new structures on the computer and then 3D print those structures. “It’s a bit like freezing the flame of a fire in time and looking at it from different angles,” Buehler said. “You can touch it, turn it, and the other thing you can do is look into the flames, something that no human has ever seen.”