Exhibition and dialogue platform
Whether in our private or professional lives, in social or economic terms: robotics and artificial intelligence have fundamentally changed our lives – and will continue to do so. Which areas will see the greatest impact of these innovative technologies? What new applications will we benefit from in the future?
In addition to concrete applications for work environments, healthcare, mobility, and the environment, AI.Society also sheds light on ethical issues in the field of robotics and AI. After all, the associated social transformation brings about both opportunities and new challenges. This makes it all the more important to shape this transformation process carefully. Representatives from science, business, politics, and society will jointly develop visions and approaches for how this process can contribute to creating a future worth living. A future that always considers society.
Nothing influences and changes the world of work as profoundly as the development of new technologies and production resources. The perspective on this change, however, has itself begun to shift: research experts at the Technical University of Munich (TUM) are now taking a people-centered approach in their vision of the factory of the future. This is not about replacing or disenfranchising humans through technology, but about strengthening their skills, expanding their craftsmanship, and creating a safe working environment.
The interdisciplinary Work@MIRMI network includes more than 20 institutes from the faculties of Mechanical Engineering, Computer Science, Automation, Electrical Engineering, Construction, Geography, Environment, Sports, and Health. It conducts research into challenges such as demographic change, the shortage of skilled specialists, climate change, and Europe's competitive position on a global scale. One of its lighthouse projects is the KI.FABRIK of the Bavarian High-Tech Agenda, to be implemented by 2030, which will provide resilient and profitable facilities producing state-of-the-art IT and high-tech mechatronics components. AI.Society offers deep insights into various showcases.
Automated guided vehicles (AGVs) are an integral part of automated intralogistics systems. Even though AGVs can be used effectively for the transport of standardized load carriers, they are not capable of performing other activities during transport that would add value to the transport process. The aim of the project “Adaptive process planning using mobile pre-assembly systems” is to expand the range of tasks performed by AGVs to include complex handling and pre-assembly processes that can be carried out while the vehicle is in motion. By combining a mobile robot with two collaborative manipulators, tasks currently performed manually on mobile platforms can be automated. This, in turn, enables stationary work for humans even in flexible production processes. The AI-based control system is designed to react flexibly to load peaks and errors.
Using technology developed in the Horizon 2020 European project I.AM. (Impact-Aware Manipulation), we show one of the use cases that could benefit from intentional impacts in robotic manipulation tasks: the rapid grasping of objects using a robotic dual-arm setup, where contact between the end effector and the parcel is established at nonzero relative velocity. In our demo, we show how this idea can help decrease the cycle time for robotic depalletization, using two Franka Emika robots with custom silicone flat end effectors and a set of test parcels with varying weights of up to 2 kilograms.
Roundpeg presents the first robot with an integrated human detection system that can work fence-free at speeds matching those of industrial robots.
When robots learn by observing humans, it is usually not enough to just track the human's motion; further information about the human's interaction with the environment must be determined. The demo shows how a manipulator can determine the dynamic parameters of objects by interacting with them. The object's mass, center of gravity, and inertia are of great importance for successfully executing the task to be learned.
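To illustrate the principle behind such dynamic parameter identification, the following minimal sketch (synthetic data and hypothetical names, not the exhibitors' implementation) recovers an object's mass from force and acceleration samples via a least-squares fit of Newton's second law.

```python
# Minimal sketch (hypothetical data and names): estimating an object's mass
# from force/acceleration measurements gathered while a manipulator moves it.
# The physics used is f = m * (a - g); a least-squares fit over many samples
# gives a robust estimate despite sensor noise.
import numpy as np

g = np.array([0.0, 0.0, -9.81])  # gravity in the robot base frame

def estimate_mass(accelerations, forces):
    """accelerations, forces: (N, 3) arrays of end-effector acceleration and
    measured contact force while the object is rigidly grasped."""
    # Stack the per-axis equations f = m * (a - g) into one least-squares problem.
    A = (accelerations - g).reshape(-1, 1)   # (3N, 1)
    b = forces.reshape(-1)                   # (3N,)
    m, *_ = np.linalg.lstsq(A, b, rcond=None)
    return float(m[0])

# Synthetic example: a 1.5 kg object with noisy force readings.
rng = np.random.default_rng(0)
acc = rng.normal(0.0, 1.0, size=(200, 3))
frc = 1.5 * (acc - g) + rng.normal(0.0, 0.2, size=(200, 3))
print(f"estimated mass: {estimate_mass(acc, frc):.2f} kg")
```

Estimating the center of gravity and the inertia tensor works analogously, with torque measurements added to the stacked regression.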
Network management in the AI Factory: we will show how AI can provide deterministic guarantees on network latency in a factory.
Digital twins in the KI.FABRIK are created using engineering data as well as data from operation. A prototype digital twin of the WITTENSTEIN gearbox was developed to describe an asset electronically in a standardized manner, relying on the Asset Administration Shell. The resulting Digital Twin enables the exchange of asset-related data among industrial assets and between assets and production orchestration systems or engineering tools.
CENTRAL.AI and automatic planning & reconfiguration enable competitive automated manufacturing (Production-as-a-Service) in Bavaria even for batch size 1 for complex products.
This demo addresses the problem of deformable object manipulation in the production process, which is currently still handled only manually. Visual and tactile information is provided for detection and tracking, and the wiring task is carried out autonomously through the collaboration of two Panda robots.
A powerful Digital Twin thus provides an innovative way to improve production management and accessibility for human operators.
The mobile robot demo in the AI factory showcases the capabilities of an advanced autonomous robot in a manufacturing environment. The robot is equipped with cutting-edge AI technologies and sensors that enable it to navigate through the factory, identify objects, and perform tasks with high precision and efficiency.
Collective Learning is part of the KI.FABRIK. It learns the skills needed for assembling or fabricating objects in an industrial environment. The robot collective consists of 100 robots that are learning a wide range of insertion tasks. After an adequate number of tasks have been learned, the system has enough knowledge to immediately solve a new, previously unknown task. Collective Learning provides the low-level parametrization of skills needed by the Central AI of the KI.FABRIK to produce new products.
We leverage state-of-the-art generative Artificial Intelligence (AI) algorithms and Additive Manufacturing (AM) to optimize product design for small production volumes. Our program enables seamless interaction between human designers and AI, facilitating a natural and intuitive design process. By iteratively selecting AI-generated design suggestions that meet specific criteria, designers can guide the AI through the generation process. The use of AM allows us to print and immediately deploy the generated design. This process is fast, cost-effective, and highly adjustable, making it ideal for the development of customized products or single-use equipment.
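The iterative selection loop can be sketched roughly as follows; all function names are hypothetical placeholders standing in for the generative model and the human designer's choice, not the actual program shown at the booth.

```python
# Illustrative sketch of a human-in-the-loop generative design loop:
# the designer repeatedly keeps the candidates that meet their criteria,
# and the generator is re-seeded with those selections.
import random

def generate_candidates(seeds, n=4):
    """Stand-in for a generative model: perturb the selected seed designs."""
    return [{"stiffness": s["stiffness"] * random.uniform(0.9, 1.1),
             "mass": s["mass"] * random.uniform(0.9, 1.1)}
            for s in random.choices(seeds, k=n)]

def designer_selects(candidates):
    """Stand-in for the human choice: keep the lightest sufficiently stiff designs."""
    ok = [c for c in candidates if c["stiffness"] >= 1.0]
    return sorted(ok, key=lambda c: c["mass"])[:2] or candidates[:1]

designs = [{"stiffness": 1.0, "mass": 1.0}]      # initial design
for iteration in range(5):
    candidates = generate_candidates(designs)    # AI proposes variants
    designs = designer_selects(candidates)       # human-guided selection
print("final candidate for 3D printing:", designs[0])
```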
With the Robosphere, Hydrabyte presents a system for optimal near-field motion planning of manipulator groups. Hydrabyte's unique multi-robot technology allows multiple assembly processes to be carried out in parallel in the smallest of spaces. Together with Olive Robotics GmbH, the assembly of robotic systems using the modular robotic components from Olive Robotics will be demonstrated at the booth.
Mimetik develops solutions for human-centric automation and robot/machine control. Mimetik's core know-how is an IIoT sensor of human activity, based on AI-powered industrial-quality data gloves that track hand and finger movement and extract intelligence from the motion data. The proprietary AI achieves high accuracy and quick deployment. With a focus on industrial applications, Mimetik improves the training of workers at assembly lines, optimizes the efficiency and ergonomics of workplaces, and improves automated quality control.
The demo provides an overview of how the Meshmerize network can quickly establish network coverage in areas that currently pose a challenge for cellular networks. We are excited to show in our scenario just how easy it is to deploy a reliable network and maintain strong connectivity to greatly benefit firefighting efforts. Not being dependent on cellular network infrastructure means it can be used in a remote forest as well as in a buzzing city. UAVs and mesh networking can give firefighters an extra pair of eyes to help fight the flames, ensuring safe site inspection, prioritized and strategic disaster response, efficient search and rescue operations, information sharing, and keeping track of team members' locations.
Mobility moves us all – quite literally – and that includes robotics and AI.Society. Sustainable and flexible mobility solutions will shape our future. Autonomous vehicles, interconnected mobility, flying robots, dynamics modeling, and collision detection are essential components of mobile robotics. In addition, AI makes it easier for us to access new forms of mobility and helps us eliminate human error.
driveblocks showcases its software technology for mapless autonomy in the commercial vehicle sector (e.g. highway trucking, container terminals, or mining). This new approach overcomes the challenges with scaling and robustness faced by many applications today. The product is packaged in a modular way and can be licensed as individual software modules or as a full-stack autonomous driving solution.
Autonomous driving is a megatrend in the industry. However, we have not yet seen the big breakthrough of autonomous cars on public streets. Even though many algorithms exist for individual subtasks of autonomous driving, such as image recognition, setting up an optimal, robust overall software stack remains a significant challenge. Our research targets this aspect: we want to create a fully autonomous software stack capable of running on public roads. To this end, we have an autonomous research vehicle equipped with extensive sensor technology and high-performance computers to deploy and test the developed software.
"Our demo showcases the DARKO concept of a mobile manipulator that efficiently and safely carries out intralogistics tasks in agile production in an industrial environment shared with humans. The robot picks up objects from a shelf and moves them while utilizing an anthropomorphic mockup driver to interact naturally with co-workers. It then throws the objects into boxes on a conveyor belt."
We present our innovative electric motor design for use in the automotive sector. Without using any magnets, our motor outperforms current motor technology, which requires rare-earth magnets. Not needing these magnets makes the motor cheaper and more sustainable.
Aerial robotics has made enormous advances in recent years. We will present the latest results on an unconventional aerial robotic system that leverages compliant elements in its mechanical design to perform novel hybrid ground-aerial locomotion and manipulation tasks. Beyond that, we will present a student drone-racing initiative for a novel class of large unmanned aerial systems.
Many robots previously used in industrial settings are now being adapted to meet the more demanding challenges of the healthcare sector. How come? It is a reaction to an aging population and the shortage of healthcare professionals. Computer vision, machine learning, and virtual and augmented reality, among other things, will take robotics to an entirely new level, enabling robots to become true members of staff.
No one likes to spill a coffee — neither do robots. At this demo, we want to demonstrate that robots can transport and manipulate liquids with safety guarantees. That is, we present robots with spill-free capabilities! We would also like to invite all participants to interact with and telecontrol our robotic arms safely transporting your favourite drink. Would you allow a robot to carry your favourite beverage? Visit our booth and our robots at Automatica Munich_i 2023!
Real-time visualization of muscle efforts during a series of movements in a human musculoskeletal model.
For ubiquitous use in modern households, robots need to learn new skills, represent the skill's relevant features, and adapt the skill to their own kinematics. The goal is to execute a skill with (other) objects in a new environment whose constraints or geometry differ from the environment in which the skill was learned. We propose a two-phase system for first learning and subsequently adapting skills. In the learning phase, a model- and knowledge-based approach is compared with an experience-based approach in terms of their representational qualities. In the adaptation phase, a robot executes one of the learned tasks (i.e., a skill sequence) on its own.
We are presenting new skills of GARMI that have been developed since our presentation last year. In addition to enhanced telemedicine skills, we will show how GARMI can function as a teleoperated avatar using various input devices, such as our self-developed exoskeleton, an Omega haptic controller, and a full VR environment. We will also demonstrate some of the autonomous service features of GARMI.
Exoskeleton to remotely control the humanoid avatar GARMI and interact with patients.
The Telemedicine demo will show how the service humanoid robot GARMI can be operated using a haptic input device (Omega.7) or an HTC Vive controller.
Hands play an important role in human life for prehensile, proprioceptive, and communication purposes. Compensating for the loss of fine, coordinated function of the upper extremities with prostheses is a medical, technological, psychological, and social challenge. Even though artificial limbs open up the prospect of restoring some missing capabilities, there is still a wide gap between available commercial devices and the perceived demands of prosthesis users. This booth includes novel solutions for more intelligent neuroprostheses, both in their structure and in the control loop. We will present demos of commercially available hands, explorative and bio-inspired designs based on soft robotics, and different control modalities.
The research project AURORA aims at developing a mobile service robot for the OR wing. This "robotic circulating nurse" moves freely within non-sterile OR areas and autonomously supports the surgical team with context-dependent tasks, such as the provision of sterile goods or the adjustment of medical devices. In this way, today's severe shortage of nurses can be alleviated. The system is also capable of collaborating with other robots, which includes sterile handover interactions with the instrumentation robot SASHA-OR.
In the SASHA-OR research project, an intelligent assistance robot is being developed for the surgical operating room, designed for sterile use at the operating table. The system supports the surgical team with flexible instrument and object management. By incorporating tactile sensor technology and computer vision methods, the robot can verbally interact with the surgical team and recognize, hand over, and receive requested laparoscopic instruments and sterile goods. In addition to robot-human interaction, the system also targets robot-robot interaction to enable sterile handover actions with the robotic circulating nurse AURORA.
We are excited to present our research on some of the latest advancements in catheter tracking technologies and their potential impact on future minimally invasive endovascular procedures. These technologies utilize advanced algorithms and a variety of imaging and sensing techniques to provide real-time guidance during catheter placement. The integration of these technologies has the potential to enhance the accuracy and efficiency of medical procedures, even in complex anatomical structures. By reducing errors and improving patient outcomes, catheter tracking technologies represent an exciting area of growth in medical technology. Their incorporation into medical interventions underscores the potential for artificial intelligence to transform the field of interventional medicine and improve patient care.
Patients in the intensive care unit require constant monitoring to prevent untoward incidents such as falls or the removal of catheters. However, due to the shortage of healthcare workers and an ever-increasing amount of paperwork, it is not possible to have a healthcare worker by a patient's side at all times. To overcome this problem, we propose using cameras to monitor patients by tracking their 3D skeletons. The patients' actions can then be recognized from the tracked skeletons. With this information, a system can be built to notify healthcare workers if something has happened to a patient.
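As a toy illustration of how tracked skeletons can feed such an alerting system, the sketch below flags a possible fall from the height and downward velocity of a single hip keypoint; all names and thresholds are hypothetical, and a real system would use learned action recognition over full skeleton sequences.

```python
# Minimal illustrative sketch (hypothetical thresholds and data layout):
# flag a possible fall when the tracked hip keypoint drops quickly toward
# the floor, then notify staff.
from dataclasses import dataclass

@dataclass
class SkeletonFrame:
    timestamp: float      # seconds
    hip_height: float     # metres above the floor, from the 3D skeleton

def detect_possible_fall(frames, height_threshold=0.3, drop_rate=0.8):
    """Return True if the hip drops below `height_threshold` metres
    faster than `drop_rate` metres per second."""
    for prev, curr in zip(frames, frames[1:]):
        dt = curr.timestamp - prev.timestamp
        if dt <= 0:
            continue
        speed_down = (prev.hip_height - curr.hip_height) / dt
        if curr.hip_height < height_threshold and speed_down > drop_rate:
            return True
    return False

# Example: a patient sliding off the bed within half a second.
frames = [SkeletonFrame(0.0, 0.9), SkeletonFrame(0.25, 0.6), SkeletonFrame(0.5, 0.2)]
if detect_possible_fall(frames):
    print("ALERT: possible fall detected, notifying staff")
```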
In mirror therapy, a mirror is placed at the body midline. Looking into the mirror, the reflection of the unimpaired hand is seen at the position of the hidden, impaired hand behind the mirror. Observing a seemingly normally moving impaired limb has a strong stimulating effect on the motor system and has been shown to be highly effective in stroke treatment. You will be able to perceive this effect in the VR demo when the movements of your virtual hand are controlled by the real movements of your opposite hand. During bilateral or unilateral games, you will become confident in using your avatar hand.
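The underlying mirroring step can be illustrated with a minimal sketch; the coordinate convention and numbers are hypothetical, not the demo's actual implementation. Tracked positions of the unimpaired hand are simply reflected across the body midline to drive the virtual hand.

```python
# Minimal sketch of the mirroring idea (hypothetical coordinate convention):
# tracked 3D points of the unimpaired hand are reflected across the body
# midline (the x = 0 plane here) to animate the virtual impaired hand.
def mirror_across_midline(point):
    """Reflect a tracked 3D point (x, y, z) across the sagittal plane x = 0."""
    x, y, z = point
    return (-x, y, z)

# Example: the real right hand at x = +0.25 m drives a virtual left hand at x = -0.25 m.
right_hand = (0.25, 0.10, 1.05)
virtual_left_hand = mirror_across_midline(right_hand)
print(virtual_left_hand)   # (-0.25, 0.10, 1.05)
```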
Climate and environmental protection are among the most pressing issues of our time. How can AI and smart robots support coordinated efforts to tackle this major global challenge? Various applications are already being used in sustainable agriculture, environmental protection, and air quality measurement.
90% of trade is carried by sea, yet the vast majority of cargo ships in use today run on the dirtiest fuel available. CargoKite addresses this problem with an innovative, patent-pending, autonomous micro cargo ship powered solely by wind energy. Robotics and automation are at the core of this 21st-century version of a sailing ship. At Automatica, you will get to experience a small-scale model of the ship.
The Professorship of Environmental Sensing and Modeling employs advanced sensor models in conjunction with machine learning to enhance the accuracy of environmental sensors, particularly for measuring air quality and carbon and methane emissions. In addition, by extracting valuable insights from large datasets, we aim to predict environmental parameters at locations without measurements. This is a vital task in the context of climate change, where rough bottom-up estimates still lack validation from accurate measurements and modeling. The combination of advanced sensors, atmospheric modeling, and AI creates a powerful tool that informs decision-making regarding environmental policy and management.
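To illustrate the kind of sensor correction involved, the following sketch fits a simple regression of a low-cost sensor's readings against a reference instrument, using synthetic data and a hypothetical feature set; the professorship's actual models and features are considerably more sophisticated.

```python
# Minimal sketch (synthetic data, hypothetical feature set): correcting a
# low-cost air-quality sensor against a reference instrument with a simple
# linear regression over the raw reading plus temperature and humidity.
import numpy as np

rng = np.random.default_rng(1)
n = 500
raw = rng.uniform(5, 80, n)             # low-cost sensor reading (e.g. ppb)
temp = rng.uniform(-5, 35, n)           # ambient temperature in deg C
hum = rng.uniform(20, 95, n)            # relative humidity in %
# Synthetic "true" concentration with temperature/humidity cross-sensitivity.
ref = 0.8 * raw - 0.15 * temp + 0.05 * hum + rng.normal(0, 1.0, n)

# Fit a linear correction model ref ~ X @ w via least squares.
X = np.column_stack([raw, temp, hum, np.ones(n)])
w, *_ = np.linalg.lstsq(X, ref, rcond=None)

corrected = X @ w
rmse_raw = np.sqrt(np.mean((raw - ref) ** 2))
rmse_cal = np.sqrt(np.mean((corrected - ref) ** 2))
print(f"RMSE before calibration: {rmse_raw:.2f}, after: {rmse_cal:.2f}")
```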
This study aims to investigate the use of sensors like cameras, multispectral cameras, and thermal cameras to detect if the ground in front of autonomous agricultural field robots is traversable. A comprehensive dataset will be collected using a robotic platform in challenging, natural environments, and hand-crafted difficult terrains. The experiments will be conducted multiple times, with variations in environmental conditions and throughout different seasons. Hand-crafted terrains will consist of challenges like soil with high moisture levels, large furrows, frozen ground, and agricultural residues.
The i_space forum looks beyond technical feasibility aspects of robotics and automation. Carefully selected demo teams provide exciting insights into their current research and real-life scenarios. Program highlights include the Ethics Round Table and the KI.FABRIK Bavaria panel discussion.