Project E-Motion digitizes motion sequences

“Even the first part of E-Motion would be a great success once it reaches production maturity, because it significantly supports the MTM engineers in their analysis work. But it gets really exciting when an avatar demonstrates ergonomically optimal movements in a VR environment.” Gerald Müller, Vice President Process and Efficiency, DB Schenker

AI helps to code movements

Together with the Fraunhofer IML and the start-up MotionMiners, DB Schenker is working on a prototype that makes it possible to use MTM to analyze motion sequences recorded by body-worn motion sensors.

In principle, process engineers have been analyzing movements for many decades, translating them into a kind of readable code and thus describing them exactly. However, MTM (Methods-Time Measurement, see the infobox below) and the lesser-known Work Factor (WF) method are time-consuming procedures for analyzing workflows and then examining their ergonomics and efficiency. In industry, this method is used above all for planning new assembly lines, but also in logistics for the calculation of services. Motion mining, meaning the targeted acquisition and evaluation of movement data, is supposed to accelerate this analysis process and support the MTM engineer in their work through self-learning algorithms.

First, data on the motion sequence must be recorded digitally. For this purpose, employees wear sensors on the body. The recorded process is transferred to the MotionMiners tool and analyzed; when a group of people is analyzed, the data can also be anonymized at this stage. In conjunction with DB Schenker’s own MTM database, the motion mining technology can then (partially) automatically link the movement data with MTM-based process components and thus evaluate movement sequences promptly.
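To make the data flow concrete, here is a minimal Python sketch of the pipeline described above: body-worn sensor recordings are anonymized and then matched against an MTM database. All class and function names, the segment format and the matching rule are assumptions for illustration only, not the actual MotionMiners or DB Schenker tooling.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SensorRecording:
    worker_id: str        # identity of the wearer, removed during anonymization
    segments: List[dict]  # pre-segmented movement sections, e.g. {"type": "pick", "duration_s": 1.2}

def anonymize(recordings: List[SensorRecording]) -> List[SensorRecording]:
    """When a group of people is analyzed, strip identifying information."""
    return [SensorRecording("anonymous", r.segments) for r in recordings]

def link_to_mtm(recording: SensorRecording, mtm_db: dict) -> List[str]:
    """(Partially) automatic lookup of each movement section in an MTM database."""
    return [mtm_db.get(seg["type"], "UNMATCHED: ask the MTM engineer") for seg in recording.segments]

if __name__ == "__main__":
    mtm_db = {"walk": "W", "pick": "R+G", "place": "M+P+RL"}  # illustrative codes only
    recordings = [SensorRecording("worker-017", [{"type": "pick", "duration_s": 1.2},
                                                 {"type": "carry", "duration_s": 3.0}])]
    for rec in anonymize(recordings):
        print(rec.worker_id, link_to_mtm(rec, mtm_db))        # "carry" falls through to the engineer
```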

Sometimes the assistance system developed by the Fraunhofer Institute presents movement sections with corresponding images to the MTM engineer, for example when the data is ambiguous or when weight parameters and distances must be added. Even then, the platform makes a pre-selection from which the MTM engineer can choose. At the same time, the self-learning algorithms are to be trained in the future via the MTM engineer’s feedback so that, given a sufficiently large training data base, they can independently analyze more and more ambiguous movements. Intensive testing has shown that Schenker is taking the right approach with this idea, but further development is needed to bring the prototype to production maturity.
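Conceptually, this is a human-in-the-loop workflow: the system proposes a ranked pre-selection, the engineer confirms or corrects it, and the decision becomes a labeled training example for the self-learning algorithms. The sketch below shows only this control flow; the ranking heuristic and all names are illustrative assumptions, not the Fraunhofer assistance system itself.

```python
def propose_candidates(segment, mtm_components, top_k=3):
    """Rank MTM components by a crude duration similarity and return a pre-selection."""
    ranked = sorted(mtm_components, key=lambda c: abs(c["typical_duration_s"] - segment["duration_s"]))
    return ranked[:top_k]

def engineer_confirms(segment, candidates):
    """Stand-in for the MTM engineer reviewing the section (with images) on screen."""
    return candidates[0]  # in the real tool the engineer chooses and adds weights and distances

mtm_components = [
    {"code": "R30", "typical_duration_s": 0.5},  # illustrative entries only
    {"code": "G1A", "typical_duration_s": 0.2},
    {"code": "W5",  "typical_duration_s": 1.8},
]
training_data = []

segment = {"duration_s": 0.45, "images": ["frame_0123.png"]}
choice = engineer_confirms(segment, propose_candidates(segment, mtm_components))
training_data.append((segment, choice["code"]))  # this feedback later trains the model
print(training_data)
```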

“#DBSchenker and #VR: An #avatar will teach #ergonomics in the warehouse.”


“When it’s ready for series production, this subsection of the E-Motion project alone will offer tremendous process optimization for our MTM engineers, who will then be able to code and analyze motion sequences significantly faster with the combination of the MotionMiners tool and the assistance system,” says Gerald Müller. “But it will get even more exciting when we use this MTM data to program avatars and check their movements for ergonomics. In this way, avatars with ergonomically optimized motion sequences can be created. At the same time, such an avatar can show employees how a movement process is optimally designed, because people do not learn movement sequences from an MTM process report, but through imitation.”

Colleague ‘Avatar’ is supposed to teach ergonomic movements in VR in the warehouse

Part 2 of “E-Motion” is intended to revolutionize employee training at DB Schenker: the avatar is a kind of virtual trainer who gives hints so that employees can pick and pack orders in a health-conscious way, carrying out their jobs in a manner that protects the back and joints. For the avatar to do this, a computer-optimized motion sequence is needed, which the avatar then demonstrates. This optimized motion sequence is based on a real sequence in the warehouse, which is recorded using motion capture technology and translated via software into an ergonomic assessment based on EAWS (Ergonomic Assessment Worksheet).
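The translation from captured motion to an ergonomic assessment can be pictured as a scoring function over joint angles, as in the simplified Python sketch below. The thresholds and point values are invented for illustration and are not the official EAWS scoring tables; only the general idea (posture penalties summed over a recorded sequence) follows the text above.

```python
def trunk_flexion_points(trunk_angle_deg: float) -> int:
    """Assign penalty points for forward bending of the trunk (illustrative thresholds)."""
    if trunk_angle_deg < 20:
        return 0
    if trunk_angle_deg < 60:
        return 2
    return 5

def assess_sequence(frames) -> int:
    """Sum posture penalties over a motion-captured sequence."""
    return sum(trunk_flexion_points(f["trunk_angle_deg"]) for f in frames)

# e.g. reaching into a pallet: upright, slight bend, deep bend, upright again
captured = [{"trunk_angle_deg": a} for a in (10, 35, 70, 15)]
print("ergonomic penalty:", assess_sequence(captured))
```

A lower total suggests a more back-friendly sequence, which is what the avatar is then meant to demonstrate.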

Everything starts with the digitized motion sequence. But creating an optimal sequence of movements is not enough: it must then be communicated to the employees as intuitively as possible. Virtual reality creates an ideal space for learning movements, because no other medium allows a simulated physical interaction that is so similar to real-life on-the-job training.

An avatar thus translates the process optimized on the computer into an intuitively comprehensible movement and can demonstrate it to the employee. If this avatar is transferred to a virtual environment, employees can be trained there: optimal movement processes are shown directly in VR, and the trainer points out and corrects wrong movements where necessary.

Christina Kunze, Senior Project Manager – Digital Engineering / Trainings

“We are looking for a way to teach movement sequences in employee training as intuitively as possible. This is where VR really comes in handy.”

First experiments and prototypes with programmed avatars have already been created and run on a PC. However, for the avatar to respond precisely to complex motion sequences in VR, more fine-tuning is required in the sensor technology for motion capture measurement. Solutions are in the works. With enough sensory feedback, the avatar will in future be able to visualize how a sequence of movements could be done better and more healthily, for example how to pack correctly and ergonomically at a packing station.

“What good does it do us if only the Process Optimization department knows what a motion sequence looks like? So we are looking for a way to teach movement sequences in employee training as intuitively as possible. This is where VR really stands out,” says Christina Kunze. DB Schenker is working on this form of knowledge transfer in cooperation with the Fraunhofer Institute and MotionMiners. Such VR training, offered as a supplement to and a regular session alongside conventional employee training, is set to revolutionize training at DB Schenker as a whole.

The project team on DB Schenker’s side consists of Christina Kunze, Christopher Habicht and Gerald Müller; Benjamin Korth from the Fraunhofer Institute and René Grzeszick and Sascha Feldhorst from MotionMiners complete the team.

Infobox MTM

MTM (Methods-Time Measurement) is a workflow time analysis for planning manual workflows, whose basic features were already developed in the 1920s. In MTM, all humanly possible movements are broken down into smaller movement elements, so-called basic movements such as “gripping”, “releasing”, “joining”, etc. Each basic movement is given a letter code: “R”, for example, stands for “reach”. Empirically determined times for the basic movements are stored in tables and are supplemented by parameters such as the distance of the movement or the weight of the object. An entire sequence of movements can thus be precisely coded in writing and then examined and optimized for ergonomics and efficiency. Today this coding is done by an MTM engineer with software assistance on screen, usually by watching a video of the motion and translating it into MTM code. The process engineer can use MTM as a kind of programming language for creating motion sequences, provided they can rely on a database with a sufficiently large number of relevant basic movements.
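The coding principle from the infobox can be illustrated with a small Python lookup: each basic movement is a letter code whose table time (in TMU, where 1 TMU corresponds to 0.036 seconds) is refined by parameters such as distance. The TMU values below are placeholders, not the real MTM time tables.

```python
TMU_TO_SECONDS = 0.036  # standard MTM time unit conversion

# (code, distance in cm) -> time in TMU; placeholder values for illustration only
BASIC_MOVEMENTS = {
    ("R", 30): 9.5,     # Reach, 30 cm
    ("G", None): 2.0,   # Grasp a small object
    ("M", 30): 12.0,    # Move, 30 cm
    ("RL", None): 2.0,  # Release
}

def encode(sequence):
    """Translate a list of (code, distance_cm) steps into total TMU and seconds."""
    total_tmu = sum(BASIC_MOVEMENTS[step] for step in sequence)
    return total_tmu, round(total_tmu * TMU_TO_SECONDS, 3)

# A tiny pick-and-place cycle: reach 30 cm, grasp, move 30 cm, release
steps = [("R", 30), ("G", None), ("M", 30), ("RL", None)]
print(encode(steps))  # -> (25.5, 0.918)
```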

How do you view professional VR training?