Scientists attached a mushroom to a 'biohybrid robot body' and the results were amazing
09-05-2025


Robots usually take orders from silicon chips and lines of code. In this project, the commands start in living tissue. The control signals come from mycelia – the branching underground networks that support mushrooms.

These networks sense light, touch, chemicals, and temperature. They also produce tiny electrical pulses that carry information.

Researchers at Cornell Engineering learned to read those pulses without harming the organism or muddying the signal, then translated the rhythms into digital commands that set a robot's movement.

When the environment changed, the mycelia changed their electrical patterns, and the robot adjusted how it moved. That is sensorimotor control driven by a living system.

From mushrooms to motion

After growing mycelium in small chambers fitted with electrodes, the team linked those chambers to two robots and let the fungus set the tempo for movement.

“By growing mycelium into the electronics of a robot, we were able to allow the biohybrid machine to sense and respond to the environment. In this case we used light as the input, but in the future it will be chemical,” explained Rob Shepherd, professor of mechanical and aerospace engineering in Cornell Engineering, and the paper’s senior author.

Melding mycelium with a robot

Many biohybrid systems use animal cells such as muscle fibers. Those cells can work in controlled lab setups but demand constant care and strict conditions.

Mycelia handle rougher settings. They tolerate swings in humidity and temperature, grow on basic nutrients, and respond to many cues.

That toughness makes them appealing for robots meant to operate outdoors or in messy places where dust, vibration, and weather get in the way.

Mycelia also form dense, responsive networks. The electrical activity runs in spikes and patterns rather than steady tones, which is handy for timing movement. That natural rhythm lets engineers detect meaningful signals without decoding every subtle variation.

Making whisper-quiet signals

The main hurdle was signal quality. Electrical noise from motors, wires, and the air itself can drown out small biological spikes. The team built a shielded interface that blocks vibrations and electromagnetic interference.

That design allowed clean recordings while the robots moved, which is rarely possible with living sensors because motion tends to scramble delicate measurements.

Software then monitored the incoming signal in real time. It detected spike times and translated them into control commands.

The approach focused on rhythm, not on decoding a complex language of signals, which kept the system robust.
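The paper itself describes the pipeline only at this level, but the core idea of detecting spike times in a noisy trace can be illustrated with a minimal threshold detector. Everything here is an assumption for illustration: the threshold, the refractory window, and the synthetic trace are not values from the study.

```python
import numpy as np

def detect_spikes(signal, threshold, refractory=50):
    """Return sample indices where the signal crosses the threshold.

    A simple refractory window keeps one spike from being
    counted many times while the waveform stays elevated.
    """
    spikes = []
    last = -refractory
    for i, v in enumerate(signal):
        if v > threshold and i - last >= refractory:
            spikes.append(i)
            last = i
    return spikes

# Synthetic trace: quiet baseline noise with three injected spikes.
rng = np.random.default_rng(0)
trace = rng.normal(0, 0.1, 1000)
for t in (200, 500, 800):
    trace[t] += 2.0

print(detect_spikes(trace, threshold=1.0))  # → [200, 500, 800]
```

In a real system the threshold would have to sit well above the residual electrical noise, which is exactly why the shielded interface described above matters.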

Robot gait from mycelium rhythm

Biology has a term – central pattern generator (CPG) – for circuits that produce rhythmic outputs to coordinate movement. The controller here borrowed that idea. The mycelial spikes set a beat.

The controller mapped that beat to motor timing so the robot could coordinate steps or wheel turns without micromanaging each joint.

Focusing on timing helps avoid overfitting to noisy features. If the rhythm quickened or changed shape under new conditions, the controller shifted the gait accordingly. That kept the loop simple and reliable.
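A sketch of that timing-only mapping, assuming a simple linear rule from mean inter-spike interval to gait cycle period (the function, base period, and gain are hypothetical, not the controller from the study):

```python
def gait_period(spike_times, base_period=1.0, gain=0.5):
    """Map the mean inter-spike interval (seconds) to a motor
    cycle period. Faster spiking (shorter intervals) shortens
    the gait cycle; slower spiking lengthens it.
    """
    if len(spike_times) < 2:
        return base_period  # no rhythm yet: fall back to a default gait
    intervals = [b - a for a, b in zip(spike_times, spike_times[1:])]
    mean_isi = sum(intervals) / len(intervals)
    return base_period + gain * mean_isi

# A quickened rhythm (0.2 s between spikes) yields a shorter
# cycle than a slow one (0.8 s between spikes).
fast = gait_period([0.0, 0.2, 0.4, 0.6])
slow = gait_period([0.0, 0.8, 1.6])
print(fast < slow)  # → True
```

Because only the interval statistics enter the mapping, small distortions in spike shape never reach the motors, which is the robustness the text describes.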

This biohybrid robot uses electrical signals processed from King Oyster mushroom mycelia to jump and move around. Credit: Cornell University

Two robots, one living beat

To show that the approach works across designs, the team built a soft, spider-like walker and a small wheeled rover.

Each held a living mycelium sample wired to electrodes. When the fungus produced its natural spiking pattern, the robots moved.

The mycelia sensed ultraviolet light. When UV hit the chamber, the spiking pattern changed. The robots then changed how they moved by altering gait timing.

The researchers also added an override mode through the same interface. They could pause or redirect the robot with an injected signal without damaging the tissue, combining biological autonomy with human supervision.
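One way to picture that supervision is a selector that lets an injected command preempt the fungus-driven one. This is a hypothetical interface, not the study's implementation; the command names and the rate threshold are invented for illustration.

```python
def control_step(spike_rate, override=None):
    """One tick of a supervisory loop: a human-injected override
    command wins; otherwise gait speed follows the spike rate
    (spikes per second) coming from the mycelium.
    """
    if override is not None:
        return override
    return "fast_gait" if spike_rate > 2.0 else "slow_gait"

print(control_step(3.5))           # → fast_gait
print(control_step(3.5, "pause"))  # → pause
```

The key property is that the override path never touches the tissue: it acts downstream of the recording, so biological autonomy and human control coexist.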

Real-world applications

Soil chemistry changes over hours, days, and seasons, and the cost of sampling across large fields adds up.

Systems that respond locally to nutrient cues could help apply fertilizer where it’s needed and cut back where it’s not.

“The potential for future robots could be to sense soil chemistry in row crops and decide when to add more fertilizer, for example, perhaps mitigating downstream effects of agriculture like harmful algal blooms,” Shepherd said.

Environmental monitoring faces similar trade-offs. Sensors fail in wet, dirty, or variable conditions. Mycelia evolved to survive in those conditions and react to them.

Tapping that sensitivity may give cheap, soft robots a way to notice toxins or pathogens and adjust their paths without hard-coded rules for every scenario.

Robotic mycelium beyond light

Light is just one input. Mycelia excel as chemical sensors, reacting to nutrients, toxins, and microbial signals.

Moving from light to chemical cues would open direct links to soil health, pollutant detection, and other real-world triggers.

A controller can listen for shifts in rhythm as the fungus responds to a molecule and translate that into a change in movement or task.

That shift also aligns with how fields and waterways behave. Chemistry changes in patches and pulses. A living sensor that tracks those changes in place can help robots act locally without heavy computation.

Why mycelium robots matter

Typical robots bolt on sensors as if they were simple accessories. This approach blends the sensor into the control loop itself.

The living network doesn’t just report data; it shapes timing and action. That’s a different philosophy for machine control.

“This kind of project is not just about controlling a robot. It is also about creating a true connection with the living system. Because once you hear the signal, you also understand what’s going on,” noted lead author Anand Mishra, a research associate in the Organic Robotics Lab.

“Maybe that signal is coming from some kind of stresses. So you’re seeing the physical response, because those signals we can’t visualize, but the robot is making a visualization,” Mishra concluded.

There’s still work between lab demos and long-term field deployments. Keeping the biology stable, scaling production, and validating responses across varied environments are real jobs ahead.

But the key step is now clear: a living network can steer a machine outside a benchtop, and the machine can respond in kind without the setup falling apart.


The full study was published in the journal Science Robotics.
