munich_i Session 3: Beyond human spaces

Premiere 2021


A look into the future of robotics.

The term robot conjures up an image of machines working around the clock in factories. This notion will soon change drastically, because the future looks different: biology-inspired robots whose appearance resembles trunks, worms, or snakes will enable keyhole surgery, transport robots will learn to fly, and avatar robots will take on delicate tasks in emergency situations.

Five international experts will present their ideas and concepts for the robotics of tomorrow and beyond:

Prof. Antonio Bicchi, University of Pisa
Prof. Dr. Davide Scaramuzza, University of Zurich
Prof. Dr.-Ing. Jessica Burgner-Kahrs, University of Toronto
Prof. Dr. Sethu Vijayakumar, University of Edinburgh
Prof. Dr. Roland Siegwart, ETH Zürich
Session Chair: Prof. Dr. Alin Albu-Schäffer

Most robotics applications are still in the early stages of their evolution. So far, robotics has primarily dominated automation in production: robots place weld spots in the automotive sector or mount semiconductor components on circuit boards, with short cycle times and high precision. But the next generation of robots is already waiting in the wings, and this new species of robots will take on tasks that are currently far beyond our imagination.

That is the vision currently taking shape: autonomous micro drones such as quadcopters navigating in three dimensions, and elastic micro-robots moving inside the human body and revolutionizing keyhole surgery. Heavy-duty robots will operate beyond human spaces in deep-sea and mining environments, without an operator controlling them. Artificial intelligence will enable higher levels of autonomy, and emergency responders such as fire brigades will use humanoid robots as their go-to response in rescue situations.

Join our international experts on a truly inspiring journey into the future of robotics!

The Speakers of the Session

“From Robots in Emergency to Robots in Everyday Life”

The health emergency caused by the Covid-19 pandemic has brought an urgent need to carry out work safely even in environments that, though previously familiar, have suddenly become potentially hostile. The need first arose in hospitals and retirement homes, where healthcare personnel were exposed to direct infection by patients, but in later stages it also became evident in the production, logistics, and trade of material goods. The growing need for safety and distance can be answered incisively by extending the paradigm of smart working to jobs that require physical action on people and the environment. Robotics research will relieve people of the risks and fatigue of work while leaving them able to leverage their professionalism and perform physical smart working effectively. Collaborative robots and robotic avatars (semi-autonomous intelligent machines that can be sent into dangerous environments to perform highly dexterous tasks) will be able to transfer the irreplaceable skills of operators specialized in critical scenarios while sparing them the dangers and physical fatigue. This will also help equalize tasks and opportunities between people of different physical abilities. Fortunately, recent research advances in robotics have made it possible to build machines that not only approach or surpass the computational intelligence of humans, but are also capable of ever more natural motion and can exploit the "physical" intelligence embodied in their structure.

Antonio Bicchi is a scientist interested in robotics and intelligent machines. After graduating in Pisa and receiving a Ph.D. from the University of Bologna, he spent a few years at the MIT AI Lab in Cambridge before taking the first chair in Robotics at the University of Pisa. In 2009 he founded the Soft Robotics Laboratory at the Italian Institute of Technology in Genoa, which he still leads. Since 2013 he has been Adjunct Professor at Arizona State University, Tempe, AZ. His work has been recognized with many international awards and has earned him four prestigious grants from the European Research Council (ERC). He launched initiatives such as the World Haptics conference series (the major conference on natural and artificial touch), the IEEE Robotics and Automation Letters (the largest journal in the field), and the Italian Institute of Robotics and Intelligent Machines.

“Autonomous, Agile Micro Drones”

Autonomous micro drones, such as quadcopters, will soon play a major role in search-and-rescue and remote-inspection missions, where a fast response is crucial. Quadcopters have the potential to navigate quickly through complex environments, enter and exit buildings through narrow gaps, and fly through partially collapsed buildings. However, their speed and maneuverability are still far from those of birds and human pilots, and human pilots take years to learn the skills needed to navigate drones. Autonomous agile drone navigation through unknown indoor environments poses a number of challenges for robotics research in terms of perception, state estimation, planning, and control. In this talk, I will show how the combination of model-based and machine-learning methods, united with the power of new low-latency sensors such as event-based cameras, allows drones to achieve unprecedented speed and robustness while relying solely on passive cameras, inertial sensors, and onboard computing.

Davide Scaramuzza is Professor of Robotics and Perception at both the Department of Informatics and the Department of Neuroinformatics (University of Zurich and ETH Zurich), where he does research at the intersection of robotics, computer vision, and neuroscience. Specifically, he investigates the use of standard and neuromorphic cameras to enable autonomous, agile navigation of micro drones in search-and-rescue scenarios. He did his PhD in robotics and computer vision at ETH Zurich and a postdoc at the University of Pennsylvania. He led the European project sFly (2009-2012), which introduced the PX4 autopilot and pioneered visual-SLAM-based autonomous navigation of micro drones. For his research contributions in vision-based navigation, he was awarded the IEEE Robotics and Automation Society Early Career Award, the SNSF-ERC Starting Grant, a Google Research Award, awards from KUKA, Qualcomm, and Intel, the European Young Research Award, the Misha Mahowald Neuromorphic Engineering Award, and several conference paper awards. He coauthored the book "Introduction to Autonomous Mobile Robots" and more than 100 papers on robotics and perception published in top-ranked journals and conferences. In 2015, he cofounded a venture called Zurich-Eye, dedicated to the commercialization of visual-inertial navigation solutions for mobile robots, which later became Facebook-Oculus Switzerland and Oculus' European research hub. He was also the strategic advisor of Dacuda, which later became Magic Leap Zurich.

“Rethinking Robotics – Worms, tongues, and elephant trunks”

Continuum robots are biologically inspired, organically compliant structures that differ fundamentally from traditional robots, which rely on a rigid joint-link composition. Their appearance is evocative of animals and organs such as trunks, tongues, worms, and snakes. Composed of flexible, elastic, or soft materials, continuum robots can perform complex bending motions and assume curvilinear shapes. Thanks to their small size and high dexterity, continuum robots have the potential to revolutionize keyhole access, for instance in surgery through small incisions and natural orifices, or in maintenance, repair, and operations. This allows us to rethink current approaches, as deep-seated sites become accessible along highly tortuous paths and previously unimagined maneuvers become feasible.

Jessica Burgner-Kahrs is an Associate Professor with the Departments of Mathematical & Computational Sciences, Computer Science, and Mechanical & Industrial Engineering, and founding Director of the Continuum Robotics Laboratory at the University of Toronto, Canada. From 2013 to 2019 she was with Leibniz University Hannover, Germany, and from 2010 to 2012 with Vanderbilt University, USA. She received her Diploma and Ph.D. in Computer Science from Karlsruhe Institute of Technology (KIT), Germany, in 2006 and 2010 respectively.

Her research focuses on continuum robotics, in particular their design, modeling, planning, and control, as well as human-robot interaction. Her fundamental robotics research is driven by applications in minimally invasive surgery and in maintenance, repair, and operations. In 2015, her research was recognized with the Heinz Maier-Leibnitz Prize and the Lower Saxony Science Award in the category Young Researcher, and she was named Young Researcher of the Year 2015 in Germany. The Berlin-Brandenburg Academy of Sciences awarded her the Engineering Science Prize in 2016. She was elected one of the Top 40 under 40 in the category Science and Society in 2015, 2016, and 2017 by the business magazine Capital. In 2019, Jessica was nominated as a Young Global Leader by the World Economic Forum.

“Shared Autonomy: The Future of Interactive Robotics”

The next generation of robots is going to work much more closely with humans and other robots, and to interact significantly with the environment around them. As a result, the key paradigms are shifting from isolated decision-making systems to shared control, with significant autonomy devolved to the robot platform and end-users in the loop making only high-level decisions. This talk will briefly introduce powerful machine learning technologies that are enabling us to reap the benefits of increased autonomy while still feeling securely in control. This also raises a fundamental question: what is the optimal trade-off between autonomy and control that we are comfortable with? Domains where this debate is relevant include unmanned space exploration, self-driving cars, offshore asset inspection & maintenance, deep-sea & autonomous mining, shared manufacturing, exoskeletons and prosthetics for rehabilitation, as well as smart cities, to list a few.

Sethu Vijayakumar is the Professor of Robotics at the University of Edinburgh, UK, and the Director of the Edinburgh Centre for Robotics. He has pioneered the use of large-scale machine learning techniques in the real-time control of several iconic robotic platforms such as the SARCOS and HONDA ASIMO humanoids, the KUKA-LWR robot arm, and the iLIMB prosthetic hand. His latest project (2016) involves a collaboration with NASA Johnson Space Center on the Valkyrie humanoid robot being prepared for unmanned robotic pre-deployment missions to Mars. He is a Fellow of the Royal Society of Edinburgh, a judge on BBC Robot Wars, and winner of the 2015 Tam Dalyell Prize for excellence in engaging the public with science. Professor Vijayakumar helps shape and drive the national Robotics and Autonomous Systems (RAS) agenda in his recent role as Programme co-Director for Artificial Intelligence (AI) at The Alan Turing Institute, the UK's national institute for data science and AI.

“Pioneering Flying Robots”

For fast search & rescue or inspection of complex environments, flying robots are probably the most efficient and versatile devices. However, the limited flight time and payload, as well as the restricted computing power of drones, render autonomous operations quite challenging. This talk will focus on the design and autonomous navigation of flying robots. Innovative designs of flying systems are presented, from novel concepts of omnidirectional multi-copters and blimps to solar airplanes for continuous flight. Recent results of visual and laser-based navigation (localization, mapping, planning) in GPS-denied environments are showcased and discussed, along with performance and potential applications.

Roland Siegwart (born 1959) is Professor of Autonomous Mobile Robots at ETH Zurich, founding co-director of the technology transfer center Wyss Zurich, and a board member of multiple high-tech companies. He studied mechanical engineering at ETH Zurich, spent ten years as a professor at EPFL Lausanne (1996–2006), held visiting positions at Stanford University and NASA Ames, and was Vice President of ETH Zurich (2010–2014). He is an IEEE Fellow and a recipient of the IEEE RAS Pioneer Award and the IEEE RAS Inaba Technical Award. He is among the most cited scientists in robotics worldwide, co-founder of more than half a dozen spin-off companies, and a strong promoter of innovation and entrepreneurship in Switzerland.

His interests are in the design, control and navigation of flying, wheeled and walking robots operating in complex and highly dynamical environments.

Session Chair

“Beyond human spaces” will be moderated by Prof. Dr. Alin Albu-Schäffer as Session Chair.

Robots today are becoming ubiquitous in everyday human environments, outdoors, and even on remote planets. Accordingly, mobile robots have become an established technology and come in different morphologies, matching the huge variety of application environments. From bipedal humanoid robots and multi-legged animal-like robots to rovers, drones, and flying robots with manipulation capabilities, robotics has diversified and specialized in recent years. We will address this current robotics trend in a dedicated session at the munich_i Hightech Summit, bringing together leading researchers and fostering the interaction between research and industry in this exciting emerging field.

munich_i Session 4: Building intelligence

The Building Intelligence session will provide spectacular insights into AI research – with scientists providing first-hand information with a focus...

munich_i Session 5: Interaction of humans and robots

Humanoid robots are becoming ever more capable as a result of tremendous advances in AI, sensor technology and human motion data science.

munich_i CEO Round Table on June 23, 2021, 13:00 CEST

The decision makers of international robotics look into the future of AI and robotics and give an inspiring outlook.