Your torso is more intuitive - and more precise - than joysticks for piloting drones, both simulated and real, according to a recent study by EPFL scientists. Work is already underway to implement this new body-machine-interface technology for search and rescue with drones.
Imagine piloting a drone using the movements of your torso only and leaving your head free to look around, much like a bird. EPFL research has just shown that using your torso to pilot flying machines is indeed more immersive - and more effective - than using the long-established joystick. The results are published in today’s issue of PNAS.
"Our aim was to design a control method which would be easy to learn and therefore require less mental focus from the users so that they can focus on more important issues, like search and rescue," says lead author Jenifer Miehlbradt of EPFL’s Translational Neuroengineering Laboratory led by Bertarelli Foundation Chair Silvestro Micera. "Using your torso really gives you the feeling that you are actually flying. Joysticks, on the other hand, are of simple design but mastering their use to precisely control distant objects can be challenging."
The scientists wanted to observe how people use their bodies to pilot a flying object, in this case a drone, and determine which movements are most intuitive and natural - approaching the pilot problem from a completely new perspective.
They started by monitoring the body movements of 17 individuals using 19 infrared markers placed across the upper body, while also recording muscular activity. Each participant followed the actions of a virtual drone through simulated landscapes that passed by, as viewed through virtual reality goggles.
Motion patterns emerged and the scientists quickly established torso-related strategies for piloting drones: they found that only four markers, all located on the torso, were needed to effectively pilot flight simulators and real drones through an obstacle course.
Overall, the scientists compared their torso strategies with joystick control in 39 individuals. They found that torso-based drone control outperformed joystick control in precision and reliability, and did so after only minimal training.
"Data analysis allowed us to develop a very simple and intuitive approach which could also be used with other populations, machines, and operations," says Micera, who is also a professor of Biomedical Engineering at the Scuola Superiore Sant’Anna in Italy. He adds, "The approach significantly improves the teleoperation of robots with non-human mechanical attributes."
While the PNAS results provide a genuinely new and fully immersive piloting strategy, characterizing the relevant torso parameters while leaving the head, limbs, hands and feet free to perform other actions, the proof-of-concept system still requires body markers and external motion detectors to work.
The next step is to make the torso strategy fully wearable for piloting flying objects. The range of applications is vast, from flight simulators to drones and perhaps even the planes of the future. Building on the PNAS findings, a garment that turns the torso strategy into drone control commands without external motion detectors was developed at EPFL’s Laboratory of Intelligent Systems, led by co-author Dario Floreano.
A jacket for flying like a bird
The Fly Jacket, a first implementation of the torso strategy for piloting flying objects.
Drone pilots can now achieve an unprecedented level of immersion thanks to the Fly Jacket, developed by Carine Rognon, a researcher at EPFL’s Laboratory of Intelligent Systems led by Dario Floreano. The Fly Jacket was described in the July 2018 issue of IEEE Robotics and Automation Letters.
Rognon came up with a very lightweight garment that translates the torso strategy described in the PNAS article into drone control commands without external motion detectors. It allows pilots to control their drone intuitively with their body movements - such as leaning forward and backward and pivoting their upper body - rather than with a hand-held device.
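To make the idea concrete, here is a minimal, hypothetical sketch of how torso lean angles (for instance, from a garment-embedded motion sensor) could be translated into drone velocity commands. The gains, dead zone, and command names are illustrative assumptions, not the published Fly Jacket mapping.

```python
# Hypothetical torso-to-drone command mapping. All constants and names
# are illustrative assumptions, not the actual Fly Jacket implementation.

DEAD_ZONE_DEG = 5.0  # ignore small, unintentional postural sway
PITCH_GAIN = 0.05    # m/s of forward speed per degree of forward lean
ROLL_GAIN = 2.0      # deg/s of yaw rate per degree of sideways lean

def apply_dead_zone(angle_deg: float) -> float:
    """Zero out angles inside the dead zone; shift the rest toward zero
    so commands ramp up smoothly from the dead-zone edge."""
    if abs(angle_deg) < DEAD_ZONE_DEG:
        return 0.0
    return angle_deg - DEAD_ZONE_DEG * (1.0 if angle_deg > 0 else -1.0)

def torso_to_command(pitch_deg: float, roll_deg: float) -> dict:
    """Translate torso pitch (lean forward/back) and roll (lean sideways)
    into a forward-speed and yaw-rate command for a drone."""
    pitch = apply_dead_zone(pitch_deg)
    roll = apply_dead_zone(roll_deg)
    return {
        "forward_speed": PITCH_GAIN * pitch,  # lean forward -> fly forward
        "yaw_rate": ROLL_GAIN * roll,         # lean right -> turn right
    }
```

A dead zone of this kind is one common way to keep normal breathing and small shifts of posture from producing spurious commands; the real system would also need filtering and calibration per wearer.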
Thanks to an embedded motion sensor developed by EPFL’s Laboratory of Embedded Systems, led by David Atienza, the Fly Jacket makes drone control highly responsive. It also includes an arm-support system that prevents fatigue. Together with first-person-view (FPV) goggles connected to an on-board camera, this allows pilots to navigate the skies naturally while keeping their arms spread out like wings.
"The Fly Jacket not only produces an immersive and intuitive flight control experience, but also frees the human hands for other tasks. For example, we have shown that humans could wear data gloves to give additional commands to the drone, like for take-off and landing, or to indicate points of interest seen from the drone perspective that would immediately appear on a map. This could be very useful for fire fighters or rescuers to quickly and precisely identify locations where help is needed," says Floreano.
The research published in PNAS was funded by the National Competence Center in Research in Robotics, the Swiss National Science Foundation, and the Bertarelli Foundation.
PNAS DOI: 10.1073/pnas.1718648115
IEEE DOI: 10.1109/LRA.2018.2810955