Character Animation Seminar Report: Complementing Physics with Motion Capture

Stefan John¹ and Alexis Heloir²

¹ Saarland University, Computer Graphics Lab, Im Stadtwald – Campus E 1 1, 66123 Saarbrücken, Germany, [email protected]

² DFKI, Campus D 3 2, Stuhlsatzenhausweg 3, 66123 Saarbrücken, Germany, [email protected]

Abstract. Interactive character animation requires sophisticated motion synthesis algorithms to generate poses and motions for unpredictable events. Traditional animation techniques used alone cannot produce the degree of realism or the degree of controllability required for virtual characters. Motion capture techniques and the physical simulation of characters, when combined, offer the potential to produce realistic motions while still maintaining a high level of control. A wide range of poses and motions can thus be generated, but the believability of the synthesized motion must still be assessed. Evaluation algorithms therefore judge the plausibility of generated poses so that only realistic and plausible character poses are used.

Keywords: Motion synthesis algorithms, motion capture, physical simulation, character animation, believability, interactivity.

1 Introduction

Humanoid virtual characters are essential in interactive applications such as video games or training applications. Virtual characters should be controllable and should autonomously perform tasks, react, respond, and anticipate. A common problem in interactive environments is unpredictable events: situations that cannot be predefined and therefore require sophisticated motion synthesis algorithms to generate plausible motions. Human observers are sensitive to the details and slight variations that define individual styles of movement; based on the anatomical structure of a motion, even gender recognition is possible. Slight disturbances in a motion can produce a significant degradation of its perceived quality. The problem is that synthesized motions often lack important features of natural human motion. Motion capture and dynamic simulation are therefore combined to retain the advantages of each method while avoiding their respective disadvantages. Several hybrid dynamic/kinematic models exist. In this paper, we present a review of several state-of-the-art methods aimed at achieving convincing animation by mixing procedural animation, based on dynamic models, with motion capture.

The next section reviews related work in the field. Sections 3 and 4 discuss in detail two particular papers introduced in the related work, and Section 5 presents more recent work in the field.

2 Related Work

Complementing motion capture with physics is an interdisciplinary field of research that applies knowledge from robotics, biomechanics, computer graphics, and artificial intelligence. The following sections give an overview of important related work in the fields of motion capture editing, physically-based characters, and the combination of motion capture with physics.

2.1 Motion Capture Editing

Editing motion capture data means modifying recorded motions without re-recording them, which allows the data to be generalized and reused effectively. Motion warping, proposed by Witkin [1], edits a captured animation by warping its motion parameter curves. The method creates smooth deformations that preserve the fine structure of the original motion. Parameter curves can also be overlapped and blended in order to combine different motion clips. Adapting motions to new characters is called retargeting and has been addressed by Gleicher [2]. When retargeting a motion, certain constraints, such as contact with the environment, must be maintained. Manual or automatic edits of motion capture data change the velocities of the joints, and these velocity disturbances can be visually detrimental; Choi [3] proposes a method that minimizes them by processing the motion capture data to achieve positional accuracy.

Another way to create new motions is to combine existing motion capture sequences. In [4] the authors decompose motions into their respective frequency bands as a form of motion signal processing and allow the blending gains of the bands to be controlled. Unuma [5] interpolates and transitions between motion sequences using a Fourier series expansion. To align key events in a blended motion, Rose [6] proposes radial basis functions and time-warping methods. To create a larger motion library from which to choose motions, Wiley [7] proposes resampling motion examples with linear interpolation and time-scaling. Rearranging pieces of previously recorded motions is another approach to motion capture editing; the main problem here is to find a rearrangement that meets certain criteria.
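As a concrete illustration of the curve-editing idea behind motion warping [1], the following minimal Python sketch warps a single sampled joint-angle curve with a smooth offset curve fitted through user-specified displacement keys. The function names, the linear interpolation, and the example values are illustrative assumptions, not taken from the original paper, which uses smoother spline warps.

import numpy as np

def warp_curve(theta, key_times, key_offsets):
    """Warp a sampled joint-angle curve by a smooth offset curve.

    The offset curve interpolates user-specified displacements at key
    frames, so the fine detail of the original motion is preserved while
    the overall shape of the curve is edited.
    """
    t = np.arange(len(theta))
    # Offset curve through the displacement keys (linear here for brevity;
    # motion warping proper uses smooth splines).
    offset = np.interp(t, key_times, key_offsets)
    return theta + offset

# Example: raise a joint angle by 0.3 rad around frame 60 of a 120-frame
# clip while leaving the start and end poses untouched.
theta = np.sin(np.linspace(0, 2 * np.pi, 120))   # stand-in mocap curve
warped = warp_curve(theta, key_times=[0, 60, 119], key_offsets=[0.0, 0.3, 0.0])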

Fig. 1. Motion graphs clustering frames into groups as introduced by Kovar [11].

Molina [8], Arikan [9], and Lee [10] present different methods for performing the combinatorial search. Kovar [11] presents the concept of motion graphs, as illustrated in figure 1. The main limitation of these methods is that they cannot synthesize truly novel body configurations, because every frame is taken from the original motion collection.

2.2 Physically-Based Characters

In [12] the topic of animating human athletics is detailed and a method is presented to animate human athletics in accordance with physical laws. Running, gymnastics, and diving motions are generated using hand-tuned, state-machine-driven controllers. To produce physically realistic, periodic walking and running motions, Laszlo [13] presented an approach based on limit cycle control: the authors started with an unstable open-loop controller and perturbed its parameters to obtain a stable cyclic movement. Designing physically-based controllers for individual behaviors is one solution, but combining several controllers is a better approach for generating more complex motions. Wooten [14] created parameterized controllers for simulated leaping, tumbling, landing, and balancing, and concatenated them to create gymnastic behaviors such as diving and flipping. Faloutsos [15] introduced composable controllers for physics-based character animation by automatically combining primitive controllers based on their pre-conditions for success.

In [16] Faloutsos added a repertoire of autonomous motor skills, such as recovery motions and reactions to falling, to their virtual stuntman.

2.3 Combining Motion Capture and Physics

To produce a variety of modifications of motion data, physical models are used to constrain the possible solutions. Popovic [17] proposed a physically-based motion transformation using a low-resolution physical model applied to running and jumping motion data. Pollard [18] introduced simple machines for scaling human motion: running motions are edited with a low degree-of-freedom model to limit the computational cost, which results in a faster solution. To generate efficient motion transitions, Rose [19] proposed a method using spacetime constraints; they used an inverse dynamics model to transition between motion sequences by finding a minimum-energy solution. Tracking and modifying upper-body human motion data with a dynamic simulation is the focus of [20], where the goal is to create physically plausible motions and the respective transitions for upper-body behaviors. Playter [21] combined a physics-based simulation and motion capture with a controller for running motions. A user-controlled, physics-based, animated articulated figure is presented in [22], where the goal is to modify motion to include physically-based reactions to gravity or external forces. Oshita [23] presented a dynamic motion control technique for human-like articulated figures, in which a character animates a response motion reacting to a mass being dropped on its back.

The following sections detail the approaches of foundational and recent papers in the area of combining motion capture with physics. Section 3 shows the approach of Zordan [24] to create motion capture-driven simulations that hit and react. Section 4 covers the approach of Arikan [25] to create physically plausible motions for situations in which people are pushed around. Section 5 covers a more recent approach and shows how physically-based grasping can be realized [26].

3 Motion Capture-Driven Simulations

The following sections describe the behavior modeling approach chosen in the motion capture-driven simulation paper [24], the controllers and tracking system used, and the evaluation performed to judge the quality of the synthesized motions.

3.1 Behavior Modeling

Several different animation methods exist, such as hand-keyed animation, forward/inverse kinematics, human motion capture, motion graphs, or procedural animation such as physically-based simulation. Each has its own advantages and disadvantages and has to be evaluated against different criteria. Motion capture is a widely used technique but has its limitations: not everything is recordable, and some motions are dangerous for the actors to perform. Motions such as boxing, where the impact would be too great for an actor to absorb, cannot be recorded.

One way around this limitation is to combine the different approaches, as shown in figure 2.

Fig. 2. Combining different animation techniques to retain the advantages and avoid the disadvantages of the individual methods.

The goal is to move away from data-centric approaches such as motion graphs and the so-called cluster forests of motion clustering techniques, because they offer little semantics: behaviors are buried under the complexity of the data structures.

Fig. 3. The behavioral transition diagram for the boxing example.

Motions should instead be organized by adding a new level of abstraction built on behavior modeling. A higher-level graph is required that emphasizes behavior, and the goal is to model animation with behavioral transition diagrams. A behavioral transition diagram for the boxing example is depicted in figure 3: sneaking, bumping, hitting, and balancing are displayed as nodes, the corresponding animation technique is listed for every node, and possible transitions between nodes are displayed as arrows.
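Such a behavioral transition diagram can be read as a small state machine whose nodes name a behavior and the animation technique that drives it. The sketch below is a hypothetical encoding of that idea: the node names follow figure 3, but the per-node techniques and the transition table are illustrative assumptions, not the diagram's actual contents.

from dataclasses import dataclass, field

@dataclass
class BehaviorNode:
    name: str                                   # e.g. "sneaking", "hitting"
    technique: str                              # animation technique driving the node
    transitions: list = field(default_factory=list)  # reachable behaviors

# Hypothetical encoding of the boxing diagram in figure 3.
nodes = {
    "sneaking":  BehaviorNode("sneaking",  "motion capture playback"),
    "hitting":   BehaviorNode("hitting",   "inverse kinematics + simulation"),
    "bumping":   BehaviorNode("bumping",   "physical simulation"),
    "balancing": BehaviorNode("balancing", "balance controller"),
}
nodes["sneaking"].transitions  = ["hitting", "bumping"]
nodes["hitting"].transitions   = ["sneaking", "bumping"]
nodes["bumping"].transitions   = ["balancing"]
nodes["balancing"].transitions = ["sneaking"]

def next_behaviors(current: str) -> list:
    """Return the behaviors reachable from the current node."""
    return nodes[current].transitions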

3.2 Controllers and Tracking System

The tracking system, depicted in figure 4, takes raw motion data and converts it into desired joint angles, which are fed into the tracking controller that computes the torques for the dynamic model. The balance controller also feeds torques into the dynamics simulation.

Fig. 4. Overview of the system with tracking controller, dynamic model, collision handler, task controller, and balance controller.

The torques are integrated and the new joint positions and angles are fed into the collision handler, which checks for penetration between collision objects and introduces reaction forces into the simulation. A task controller uses inverse kinematics to control actions such as hitting.
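A common way to turn detected penetrations into reaction forces is a penalty-based spring-damper response. The sketch below assumes that formulation as an illustration; it is not necessarily the exact contact scheme used in [24], and the gain values are placeholders.

import numpy as np

def penalty_force(depth, rel_vel, normal, k=4000.0, d=40.0):
    """Spring-damper penalty force for a penetrating contact.

    depth   -- penetration depth (> 0 when the objects overlap)
    rel_vel -- relative velocity of the contact points (3-vector)
    normal  -- contact normal pointing out of the penetrated object
    k, d    -- illustrative stiffness and damping gains
    """
    normal = np.asarray(normal, dtype=float)
    if depth <= 0.0:
        return np.zeros(3)
    vn = np.dot(np.asarray(rel_vel, dtype=float), normal)  # approach speed along the normal
    magnitude = k * depth - d * vn                          # push apart, damp the approach
    return max(magnitude, 0.0) * normal                     # never pull the bodies together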

Fig. 5. The tracking controller tries to match the motion capture input data as closely as possible by using an inverse dynamics system.

Fig. 6. Influence values control the mix of the different techniques.

Trajectory tracking uses inverse dynamics to follow the motion capture data and to smoothly return to tracking after a disturbance, as depicted in figure 5; it introduces torques into the simulation. An inverse kinematics chain is used to compute the arm orientations, and a task controller defines the hit target.
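A standard way to drive a simulated figure toward a stream of desired joint angles is a proportional-derivative (PD) servo per joint. The following minimal sketch assumes that form; the function name and gain values are illustrative and are not claimed to be the paper's exact controller.

import numpy as np

def tracking_torques(q, qdot, q_desired, kp=300.0, kd=30.0):
    """Per-joint PD servo driving the simulated pose toward the mocap pose.

    q, qdot    -- current joint angles and velocities of the dynamic model
    q_desired  -- joint angles extracted from the current motion capture frame
    kp, kd     -- illustrative proportional and derivative gains
    """
    q, qdot, q_desired = map(np.asarray, (q, qdot, q_desired))
    return kp * (q_desired - q) - kd * qdot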

Animation controllers must strike a compromise between preserving the recorded motion capture data and adapting it to the current situation. The controllers determine the separate influences of the basic motion capture animation, the balance control, the physical simulation, and the inverse kinematics, as depicted in figure 6. Handling unpredicted events is important for interactive applications, where the animation needs to adapt at runtime.
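One simple way to realize such influence values is a normalized weighted blend of the candidate joint configurations produced by each technique. The sketch below shows that generic formulation under the assumption of independent per-joint angles; it is not the paper's exact blending rule, and proper rotation blending would use quaternions rather than raw angles.

import numpy as np

def blend_poses(poses, influences):
    """Blend candidate joint-angle vectors by their influence weights.

    poses      -- dict mapping technique name to a joint-angle vector
    influences -- dict mapping technique name to a non-negative weight
    """
    total = sum(influences.values())
    blended = np.zeros_like(next(iter(poses.values())), dtype=float)
    for name, pose in poses.items():
        blended += (influences.get(name, 0.0) / total) * np.asarray(pose)
    return blended

# Example: lean mostly on motion capture, keep some physics and balance.
pose = blend_poses(
    {"mocap": [0.1, 0.4], "physics": [0.0, 0.5], "balance": [0.2, 0.3]},
    {"mocap": 0.6, "physics": 0.3, "balance": 0.1},
)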

Fig. 7. Parallelism of controller threads.

Some behaviors are executed in parallel, and behavior threads should be interruptible. In some situations physical correctness is important, while in other situations kinematic correctness is needed, for example when a character gets hit or is hitting. The balance control thread runs in parallel and continuously tries to balance the character, as shown in figure 7.

3.3 Evaluation

To evaluate the method, it is important to compare the synthesized motion side by side with recorded video material. The robustness of the physical simulation is a general design consideration, and the overall computation must be reduced to achieve real-time performance. The granularity of the simulation has to be considered, and certain simplifications and approximations have to be made. Dividing the character into two parts, i.e. the upper and lower body, is one option; removing body parts with low mass can also cut the computational cost and lower the numerical stiffness of the overall system. In order to achieve real-time performance, physical realism has to be weighed against interactivity.

Fig. 8. Side-by-side comparison of recorded live video footage and the simulation.

A side-by-side comparison can be seen in figure 8. Recorded live video footage is used to compare the synthesized motions with real boxing examples. The example shows that such motions are too dangerous for a motion capture actor to perform, but that combining motion capture with a physical simulation avoids this danger and still produces convincing and realistic motions. Investigating the video material provided by the authors in detail reveals that only collisions between a hand and the opponent's upper body and head are physically simulated, whereas collisions between the boxing gloves themselves are neglected. Thus the arms can penetrate each other, leading to unrealistic motions. Because the arm positions are computed with an inverse kinematics chain when hitting, the arms remain stiff even though the impact of the hand leads to reaction motions based on the resulting impact forces. The authors describe the neglected parts of the simulation as part of their approximation approach.

4 Pushing People Around

The pushing people around paper [25] introduces an oracle that recognizes visually realistic motions; this oracle is explained in this section. Selected example poses of a deformed motion are shown in figure 9.

Fig. 9. Motion deformation with different parameters yielding unrealistic poses.

An oracle is introduced to yield visually plausible modifications of a given motion by estimating the deformation parameters. The oracle is trained to decide between

good- and bad-looking motions in order to produce realistic motions and to quickly find good response motions. Different deformation parameters can be chosen, but some of them yield implausible motions, as shown in figure 9.
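Conceptually, the oracle is a binary classifier that maps a candidate deformation to a plausible/implausible decision, trained on examples of good- and bad-looking motions. The sketch below illustrates only that general idea with placeholder features, placeholder labels, and an off-the-shelf classifier; the actual features and training procedure of [25] differ.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Placeholder training set: each row stands for the feature vector of one
# candidate deformation; the label says whether a viewer judged the
# resulting motion plausible (1) or not (0).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 4))
y_train = (np.abs(X_train).sum(axis=1) < 3.0).astype(int)

# The oracle itself: any binary classifier works for this sketch.
oracle = LogisticRegression().fit(X_train, y_train)

def plausible(deformation_features: np.ndarray) -> bool:
    """Ask the oracle whether a candidate deformation looks realistic."""
    return bool(oracle.predict(deformation_features.reshape(1, -1))[0])

# Keep only the candidate deformations the oracle accepts.
candidates = rng.normal(size=(50, 4))
good = [c for c in candidates if plausible(c)]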

5 Physically-Based Grasping

The following section shows the approach of physically-based grasping [26]. When animating human characters, it is important to capture even small movements, for example interactions with the environment using the hands. Motion capture techniques deliver realistic motions, but when contact with the environment or other subtle interactions are involved, it becomes difficult to edit and retarget the motions. When grasps or handshakes do not look secure, or when impacts of falling objects on hands do not lead to the expected motions, the realism of the produced animation is reduced. Physically-based simulation of hand animations can preserve the plausibility of the generated motion, as depicted in figure 10.

Fig. 10. The collision model for the hand and an example of a generated handshake.

The controller for physically-based grasping uses a single motion capture example to form grasps of different object geometries and two-hand interactions, such as handshakes, using a physical collision model. A generated handshake sequence is obtained from a single motion capture example and adapted to the situation.

Fig. 11. State diagram of the hand actions.

A hand state diagram with the neutral, opening, closing, gripping, releasing, and relaxing states can be seen in figure 11. A rigid-body articulation is used for the physical simulation of the hand and allows for simple interactive object manipulation.
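The hand state diagram can be read as a contact-driven state machine: the hand pre-shapes and closes until enough finger contacts are sensed, grips while the object is held, and then releases and relaxes. The sketch below is a hypothetical encoding of such a controller loop; the state names follow figure 11, but the contact test, the threshold, and the transition conditions are illustrative assumptions rather than the controller described in [26].

from enum import Enum, auto

class HandState(Enum):
    NEUTRAL = auto()
    OPENING = auto()
    CLOSING = auto()
    GRIPPING = auto()
    RELEASING = auto()
    RELAXING = auto()

def step_hand(state: HandState, contacts: int, want_grasp: bool) -> HandState:
    """Advance the hand state machine by one simulation step.

    contacts   -- number of finger segments currently touching the object
    want_grasp -- high-level command to grasp or let go
    """
    if state == HandState.NEUTRAL and want_grasp:
        return HandState.OPENING                 # pre-shape the hand
    if state == HandState.OPENING:
        return HandState.CLOSING                 # start closing onto the object
    if state == HandState.CLOSING and contacts >= 3:
        return HandState.GRIPPING                # enough contacts: hold the object
    if state == HandState.GRIPPING and not want_grasp:
        return HandState.RELEASING
    if state == HandState.RELEASING:
        return HandState.RELAXING
    if state == HandState.RELAXING:
        return HandState.NEUTRAL
    return state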

6 Conclusion

For realistic character animation, a higher-level behavioral graph is preferable to lower-level approaches such as motion graphs. Combining motion capture techniques and dynamic simulation offers an interesting way to maintain realism and ultimately achieve controllable characters for interactive virtual environments. It enables more correct interaction with the surrounding environment than traditional animation approaches can achieve on their own.

References

1. Witkin, A., Popovic, Z.: Motion Warping. In: Proceedings of SIGGRAPH '95, ACM SIGGRAPH, pp. 105–108 (1995)
2. Gleicher, M.: Retargeting Motion to New Characters. In: Proceedings of SIGGRAPH '98, ACM SIGGRAPH, pp. 33–42 (1998)
3. Choi, K.J., Park, S.H., Ko, H.S.: Processing Motion Capture Data to Achieve Positional Accuracy. Graphical Models and Image Processing 61, 5, pp. 260–273 (1999)
4. Bruderlin, A., Williams, L.: Motion Signal Processing. In: Proceedings of SIGGRAPH '95, ACM SIGGRAPH, pp. 97–104 (1995)
5. Unuma, M., Anjyo, K., Takeuchi, R.: Fourier Principles for Emotion-Based Human Figure Animation. In: Proceedings of SIGGRAPH '95, ACM SIGGRAPH, pp. 91–96 (1995)
6. Rose, C., Cohen, M., Bodenheimer, B.: Verbs and Adverbs: Multidimensional Motion Interpolation. IEEE Computer Graphics and Applications 18, 5, pp. 32–40 (1998)
7. Wiley, D.J., Hahn, J.K.: Interpolation Synthesis of Articulated Figure Motion. IEEE Computer Graphics and Applications 17, 6, pp. 39–45 (1997)

8. Molina-Tanco, L., Hilton, A.: Realistic Synthesis of Novel Human Movements from a Database of Motion Capture. In: Workshop on Human Motion, pp. 137–142 (2000)
9. Arikan, O., Forsyth, D.A.: Interactive Motion Generation from Examples. In: Proceedings of the 29th Annual Conference on Computer Graphics and Interactive Techniques, ACM Press, pp. 483–490 (2002)
10. Lee, J., Chai, J., Reitsma, P.: Interactive Control of Avatars Animated with Human Motion Data. In: Proceedings of the 29th Annual Conference on Computer Graphics and Interactive Techniques, ACM Press, pp. 491–500 (2002)
11. Kovar, L., Gleicher, M., Pighin, F.: Motion Graphs. In: Proceedings of the 29th Annual Conference on Computer Graphics and Interactive Techniques, ACM Press, pp. 473–482 (2002)
12. Hodgins, J.K., Wooten, W., Brogan, D., O'Brien, J.: Animating Human Athletics. In: Proceedings of SIGGRAPH '95, ACM SIGGRAPH, pp. 71–78 (1995)
13. Laszlo, J.F., van de Panne, M., Fiume, E.: Limit Cycle Control and Its Application to the Animation of Balancing and Walking. In: Proceedings of SIGGRAPH '96, ACM SIGGRAPH, pp. 155–162 (1996)
14. Wooten, W.L., Hodgins, J.K.: Simulation of Leaping, Tumbling, Landing, and Balancing Humans. In: IEEE International Conference on Robotics and Automation (2000)
15. Faloutsos, P., van de Panne, M., Terzopoulos, D.: Composable Controllers for Physics-Based Character Animation. In: Proceedings of SIGGRAPH 2001, ACM SIGGRAPH, pp. 251–260 (2001)
16. Faloutsos, P., van de Panne, M., Terzopoulos, D.: The Virtual Stuntman: Dynamic Characters with a Repertoire of Autonomous Motor Skills. Computers & Graphics 25, 6, pp. 933–953 (2001)
17. Popovic, Z., Witkin, A.: Physically Based Motion Transformation. In: Proceedings of SIGGRAPH '99, ACM SIGGRAPH, pp. 11–20 (1999)
18. Pollard, N.S.: Simple Machines for Scaling Human Motion. In: Computer Animation and Simulation '99, Eurographics Animation Workshop, pp. 3–11 (1999)
19. Rose, C., Guenther, B., Bodenheimer, B., Cohen, M.F.: Efficient Generation of Motion Transitions Using Spacetime Constraints. In: Proceedings of the 23rd Annual Conference on Computer Graphics and Interactive Techniques, ACM Press, pp. 147–154 (1996)
20. Zordan, V.B., Hodgins, J.K.: Tracking and Modifying Upper-Body Human Motion Data with Dynamic Simulation. In: Computer Animation and Simulation '99, Eurographics, pp. 13–22 (1999)
21. Playter, R.: Physics-Based Simulation of Running Using Motion Capture. In: Course Notes for SIGGRAPH 2000, ACM SIGGRAPH (2000)
22. Kokkevis, E., Metaxas, D., Badler, N.: User-Controlled Physics-Based Animation for Articulated Figures. In: Proceedings of the Computer Animation 1996 Conference, pp. 16–26 (1996)
23. Oshita, M., Makinouchi, A.: A Dynamic Motion Control Technique for Human-Like Articulated Figures. Computer Graphics Forum 20, 3 (2001)
24. Zordan, V.B., Hodgins, J.K.: Motion Capture-Driven Simulations that Hit and React. In: Proceedings of the 2002 ACM SIGGRAPH/Eurographics Symposium on Computer Animation, ACM Press, pp. 89–96 (2002)
25. Arikan, O., Forsyth, D.A., O'Brien, J.F.: Pushing People Around. In: Proceedings of the 2005 ACM SIGGRAPH/Eurographics Symposium on Computer Animation, ACM Press, pp. 59–66 (2005)
26. Pollard, N.S., Zordan, V.B.: Physically Based Grasping Control from Example. In: Proceedings of the 2005 ACM SIGGRAPH/Eurographics Symposium on Computer Animation, ACM Press, pp. 311–318 (2005)