Researchers have made a smart school of robotic fish that swarm and swim just like the real deal, and they offer promising insights into how developers can improve decentralised, autonomous operations for other gizmos like self-driving vehicles and robotic space explorers. Also, they’re just pretty stinking cute.
These seven 3D-printed robots, or Bluebots, can synchronise their movements to swim in a group, or Blueswarm, without any outside control, per research published in Science Robotics this month from the Harvard John A. Paulson School of Engineering and Applied Sciences and the Wyss Institute for Biologically Inspired Engineering.
Equipped with two wide-angle cameras for eyes, each bot navigates its tank by tracking the LED lights on its peers. Based on those visual cues, each robot reacts accordingly, using an onboard Raspberry Pi computer and a custom algorithm to gauge its neighbours' distance, direction, and heading.
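To give a rough sense of how that kind of camera-and-LED sensing can work, here's a minimal Python sketch. It assumes a peer shows a pair of LEDs with a known vertical spacing, already detected as two bright blobs in the image; bearing comes from the blobs' horizontal offset and distance from how small the pair looks. The LED spacing, field-of-view numbers, and function names are illustrative assumptions, not details from the paper.

```python
import math

# Illustrative constants -- assumed values, not taken from the paper.
LED_SPACING_M = 0.05                  # assumed vertical gap between a peer's LEDs
IMAGE_WIDTH_PX, IMAGE_HEIGHT_PX = 640, 480
CAMERA_HFOV_RAD = math.radians(160)   # assumed horizontal field of view
CAMERA_VFOV_RAD = math.radians(120)   # assumed vertical field of view


def estimate_neighbour(blob_top, blob_bottom):
    """Estimate a neighbour's bearing and distance from two LED blobs.

    blob_top, blob_bottom: (x_px, y_px) pixel centres of the detected LEDs.
    Returns (bearing_rad, distance_m) in the observing robot's frame.
    Note: a real fisheye lens needs calibration and distortion correction;
    this uses a simple pinhole approximation for clarity.
    """
    # Bearing: horizontal offset of the blob pair from the image centre,
    # scaled by the camera's approximate angular resolution per pixel.
    x_mid = (blob_top[0] + blob_bottom[0]) / 2
    bearing = (x_mid - IMAGE_WIDTH_PX / 2) * (CAMERA_HFOV_RAD / IMAGE_WIDTH_PX)

    # Distance: the farther the neighbour, the smaller the angle its LED
    # pair subtends. Convert the vertical pixel gap to that angle, then
    # invert the geometry of a known-size target.
    pixel_gap = max(abs(blob_top[1] - blob_bottom[1]), 1)
    subtended = pixel_gap * (CAMERA_VFOV_RAD / IMAGE_HEIGHT_PX)
    distance = LED_SPACING_M / (2 * math.tan(subtended / 2))

    return bearing, distance
```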
“Each Bluebot implicitly reacts to its neighbours’ positions,” explains Florian Berlinger, a PhD candidate at SEAS and Wyss and first author of the research paper, per a press release. “So, if we want the robots to aggregate, then each Bluebot will calculate the position of each of its neighbours and move towards the centre. If we want the robots to disperse, the Bluebots do the opposite. If we want them to swim as a school in a circle, they are programmed to follow lights directly in front of them in a clockwise direction.”
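Berlinger's description maps naturally onto a few simple steering rules. The sketch below (Python, with invented function names and an assumed sign convention; not code from the paper) shows one way the aggregate, disperse, and circle-swimming behaviours could be expressed, given a list of neighbours as (bearing, distance) estimates like the ones above.

```python
import math

# Assumed sign convention: positive bearing means the neighbour is to the
# robot's right, so a positive heading command turns it clockwise.


def aggregate(neighbours):
    """Steer towards the centroid of perceived neighbours.

    neighbours: list of (bearing_rad, distance_m) tuples from vision.
    Returns a heading command (radians) in the robot's own frame.
    """
    x = sum(d * math.cos(b) for b, d in neighbours)
    y = sum(d * math.sin(b) for b, d in neighbours)
    return math.atan2(y, x)


def disperse(neighbours):
    """Do the opposite: head away from the neighbour centroid."""
    return aggregate(neighbours) + math.pi


def mill_clockwise(neighbours, fov=math.radians(60)):
    """Follow a neighbour seen roughly ahead, biasing turns clockwise."""
    ahead = [(b, d) for b, d in neighbours if abs(b) < fov / 2]
    if ahead:
        # Follow the nearest robot within the forward cone.
        bearing, _ = min(ahead, key=lambda nd: nd[1])
        return bearing
    # Nobody ahead: keep turning clockwise until a light comes into view.
    return math.radians(20)
```

Each rule needs only the relative positions a robot perceives on its own, which is what makes the coordination implicit: no robot broadcasts a plan or takes orders from a leader.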
Previous robotic swarms could navigate in two-dimensional spaces, but operating in three-dimensional spaces like air or water has proven tricky. The goal of this research was to create a robofish swarm whose members could move in sync all on their own, without the need for WiFi or GPS and without input from their human handlers.
“Robots are often deployed in areas that are inaccessible or dangerous to humans, areas where human intervention might not even be possible,” Berlinger said. “In these situations, it really benefits you to have a highly autonomous robot swarm that is self-sufficient.”
That’s why these robots can interpret where their peers are based solely on visual cues. In one test, researchers sent them on a simulated search mission to find and surround a red light placed in the tank. The robots first spread out using their dispersion algorithm. Once one spotted the light and approached it, its own LEDs began to flash, signalling its companions to give up the search and gather around the blinking light.
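That hand-off boils down to a small state machine running on each robot. Here's a hedged sketch of what such logic could look like; the mode names and inputs are invented for illustration and stand in for whatever each Bluebot's vision pipeline actually reports, rather than reproducing the paper's controller.

```python
from enum import Enum, auto


class Mode(Enum):
    SEARCH = auto()   # disperse and look for the red light
    FOUND = auto()    # at the light; flash LEDs to recruit the others
    GATHER = auto()   # a peer is flashing; converge on it


def update_mode(mode, sees_red_light, sees_flashing_peer):
    """One step of the (hypothetical) search-and-gather logic.

    Inputs are booleans the vision system would produce each frame.
    """
    if mode is Mode.SEARCH:
        if sees_red_light:
            return Mode.FOUND      # start flashing to signal companions
        if sees_flashing_peer:
            return Mode.GATHER     # someone else found it first
    return mode


# Example: a searching robot spots a flashing peer and switches to gathering.
mode = update_mode(Mode.SEARCH, sees_red_light=False, sees_flashing_peer=True)
assert mode is Mode.GATHER
```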
And the ability to teach robots to cooperate more effectively has huge potential beyond these little robofishes, cute as they may be. As Wired points out, if this sort of implicit coordination can be applied to self-driving cars, it could potentially help them avoid collisions. This research also holds potential for businesses like Amazon that deploy robots in their warehouses to work alongside their human coworkers.
“This is more fantasy than reality for now, but think about going to Mars, if Elon Musk and all the other rich guys really want to pull that off,” Berlinger told the outlet. Before SpaceX can start ferrying humans over to the Red Planet, they’ll need shelters set up there. “So you would have to send robot teams beforehand. And on Mars, there’s no way to control the robots, because there’s too much latency for a signal to go from here to Mars. So they really need a high degree of autonomy.”