A Two-Part Transformer Network for Controllable Motion Synthesis
Abstract:
Although part-based motion synthesis networks have been investigated to reduce the complexity of modeling heterogeneous human motions, their computational cost remains prohibitive in interactive applications. To address this, we propose a novel two-part transformer network that aims to achieve high-quality, controllable motion synthesis in real time. Our network separates the skeleton into upper and lower body parts, reducing the expensive cross-part fusion operations, and models the motion of each part separately through two streams of auto-regressive modules formed by multi-head attention layers. However, such a design might not sufficiently capture the correlations between the parts. We therefore intentionally let the two parts share the features of the root joint and design a consistency loss that penalizes differences between the root features and motions estimated by the two auto-regressive modules, significantly improving the quality of the synthesized motions. After training on our motion dataset, our network can synthesize a wide range of heterogeneous motions, such as cartwheels and twists. Experimental and user study results demonstrate that our network surpasses state-of-the-art human motion synthesis networks in the quality of the generated motions.
Keywords: Motion Capture
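To make the two-stream design more concrete, below is a minimal PyTorch sketch of the idea described in the abstract: each body part gets its own auto-regressive stream of multi-head attention layers, both streams receive and re-estimate the shared root-joint features, and a consistency loss penalizes disagreement between their root estimates. The layer sizes, feature dimensions, and exact loss form are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of the two-stream idea from the abstract:
# per-part auto-regressive attention streams, a shared root joint, and a consistency
# loss that penalizes disagreement on the root. Dimensions and layer counts are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PartStream(nn.Module):
    """One auto-regressive stream of multi-head attention layers for a body part."""

    def __init__(self, part_dim, root_dim, d_model=256, n_heads=8, n_layers=4):
        super().__init__()
        self.embed = nn.Linear(part_dim + root_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.to_part = nn.Linear(d_model, part_dim)   # next-frame part pose
        self.to_root = nn.Linear(d_model, root_dim)   # this stream's root estimate

    def forward(self, part_seq, root_seq):
        # A causal mask keeps the module auto-regressive over the time axis.
        T = part_seq.size(1)
        mask = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
        h = self.embed(torch.cat([part_seq, root_seq], dim=-1))
        h = self.encoder(h, mask=mask)
        return self.to_part(h), self.to_root(h)


class TwoPartSynthesizer(nn.Module):
    # Feature sizes below (upper/lower pose vectors, root position + orientation)
    # are illustrative assumptions.
    def __init__(self, upper_dim=72, lower_dim=48, root_dim=7):
        super().__init__()
        self.upper = PartStream(upper_dim, root_dim)
        self.lower = PartStream(lower_dim, root_dim)

    def forward(self, upper_seq, lower_seq, root_seq):
        up_pred, root_up = self.upper(upper_seq, root_seq)
        low_pred, root_low = self.lower(lower_seq, root_seq)
        # Consistency term: the two streams should agree on the shared root joint.
        consistency = F.mse_loss(root_up, root_low)
        return up_pred, low_pred, consistency
```

At inference time the predicted frames would be fed back as inputs frame by frame, which is what makes the modules auto-regressive; the consistency term would be added to the pose reconstruction losses with a weight the abstract does not specify.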
Motion capture technology has been widely used for creating character motions, and motion editing is usually also required to adjust the captured motions. Because character poses, which include joint rotations, body positions, and orientations, are high-dimensional data, it is difficult to manipulate them through a conventional mouse-based interface. We propose a motion editing system that uses a motion capture device: our system can capture a motion and then edit the captured motion with the same device. The motion-capture-based interface can specify motion editing parameters such as the time period, body part selection, end-effector position, pose, and motion segment. We conducted a user study comparing our system with a conventional mouse-based motion editing system, and the results showed that our interface is more efficient than the conventional one.
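As a rough illustration of the kind of editing parameters such an interface might expose (the abstract lists time period, body part selection, end-effector position, pose, and motion segment), here is a hypothetical Python data structure; the field names and types are assumptions, not the paper's actual data model.

```python
# Hypothetical container for the editing parameters named in the abstract.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class MotionEdit:
    time_period: Tuple[float, float]                          # start/end time of the edit, in seconds
    body_parts: List[str] = field(default_factory=list)       # e.g. ["left_arm", "spine"] (assumed names)
    end_effector_position: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # target position in 3D
    target_pose: List[float] = field(default_factory=list)    # joint rotations for the key pose
    motion_segment: Tuple[int, int] = (0, 0)                  # frame range to replace or blend
```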
This research aims to study the details of the motion of various subjects with differences in physical attributes, outlining how different physiques produce different behavioural patterns based on mass and proportion. The research considers motion to identify the differences in each subject's physical attributes. By sampling various subjects with physical differences, it concentrates on acquiring motion parameters from certain predefined actions and captured data. The experiment utilises the Vicon8i optical motion capture system to study the details of human motion by extracting the subjects' core motions for analysis. In addition, the research applies motion editing techniques in MotionBuilder to retarget and constrain the captured data. The outcome serves as a guideline for understanding the basic flow of motion for looping motions, and it provides basic data sets, based on the sampled subject variables, to be used as a reference and building block for future research. This research is especially helpful to researchers and students venturing into mocap for animation.
Despite an increasing number of acclaimed abstract animations being created with motion capture technologies, there has been little detailed documentation and analysis of this approach to abstract animation production. More specifically, it is unclear what the key considerations are, and what issues practitioners might face, when integrating motion capture movement data into their practice. In response, this study explored and documented the practice of generating abstract visual and temporal artefacts from motion-captured dance movements to compose abstract animated short films. The study has resulted in a possible framework for this form of practice and outlines five key considerations that should be taken into account by practitioners who use motion capture in the production of abstract animated short films.
The purpose of this research is to study the details of the motion of various subjects with differences in physical attributes. The research outlines how different physiques produce different behavioural patterns based on mass and proportion, focusing on walk motion to identify the differences in each subject's physical attributes by sampling subjects with physical differences. The experiment employs the Vicon8i optical motion capture (MOCAP) system to study the details of human motion by extracting the subjects' core motions for analysis with pre-defined actions. The findings are used to establish the relationship of height and weight to motion frequencies in 3D space.
The passing on and preservation of advanced technical skills have become an important issue in a variety of fields, and motion analysis using motion capture has recently become popular in research on advanced physical skills. This research aims to construct a system with a strong on-site instructional effect for dancers learning Nihon Buyo, a traditional Japanese dance, and to classify Nihon Buyo dancing according to style, school, and dancer's proficiency through motion analysis. Now that body-motion data can be digitized and stored by motion capture systems using high-performance computers, motion analysis systems for teaching Nihon Buyo can be studied. Thus, with the aim of developing a user-friendly instruction-support system, we have constructed a motion analysis system that displays a dancer's time series of body motions and center of gravity for instructional purposes. In this paper, we outline this instructional motion analysis system, which is based on three-dimensional position data obtained by motion capture. We also describe motion analysis performed on the center-of-gravity data obtained by this system and motion analysis focusing on school and age group.
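The center-of-gravity display mentioned above can be approximated from captured 3D segment positions by a mass-weighted average. The sketch below uses illustrative body-segment mass fractions and is not the described system's actual computation.

```python
# Assumed sketch: derive a center-of-gravity trajectory from motion-capture segment
# positions using standard-style body-segment mass fractions (values are illustrative).
import numpy as np

SEGMENT_MASS_FRACTION = {
    "head": 0.08, "trunk": 0.50, "left_arm": 0.05, "right_arm": 0.05,
    "left_leg": 0.16, "right_leg": 0.16,
}


def center_of_gravity(positions):
    """positions: dict mapping segment name -> (T, 3) array of 3D positions per frame.
    Returns a (T, 3) array: the mass-weighted average position at each frame."""
    total = sum(SEGMENT_MASS_FRACTION.values())
    cog = None
    for name, frac in SEGMENT_MASS_FRACTION.items():
        weighted = (frac / total) * np.asarray(positions[name])
        cog = weighted if cog is None else cog + weighted
    return cog
```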
Motion capture instruments are high-technology equipment used to accurately measure the motion of moving objects in 3D space. Applied in physical training, they can help coaches observe and monitor athletes' techniques from different perspectives, obtain motion parameters of specific technical actions along with physiological and biochemical indices, and derive statistical patterns of motion, thereby providing standard technical guidance for scientific training. This paper introduces the concept, principles, classification, and characteristics of motion capture.
We describe a method that generates dance motions with human emotions from motion-capture data. To generate the dance motions, we developed an emotional motion editor (EME). The EME adds human emotions to the dance motions by interactively modifying the original motion-capture data, for instance by changing the speed of the motion or by altering joint angles. To evaluate the emotional expressions in the dance motions generated by the EME, we performed an assessment experiment based on a questionnaire survey and examined the results with a statistical t-test. As a result, we confirmed that dance motions with human emotions are obtainable with the EME by adjusting just a few of its parameters.
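The two kinds of edits mentioned, changing the motion speed and altering joint angles, can be sketched roughly as follows; the resampling scheme, angle representation, and function names are assumptions, not the EME's actual implementation.

```python
# Assumed sketch of speed-change and joint-angle-offset edits on a (T, D) motion array,
# where each row is a frame of joint-angle channels in degrees.
import numpy as np


def change_speed(frames, speed):
    """Resample the motion so it plays `speed` times faster, via linear interpolation."""
    T = frames.shape[0]
    new_T = max(2, int(round(T / speed)))
    src = np.linspace(0, T - 1, new_T)       # fractional source frame indices
    lo = np.floor(src).astype(int)
    hi = np.minimum(lo + 1, T - 1)
    w = (src - lo)[:, None]
    return (1.0 - w) * frames[lo] + w * frames[hi]


def offset_joint_angles(frames, joint_index, offset_deg):
    """Add a constant offset (degrees) to one joint-angle channel, e.g. to exaggerate a pose."""
    edited = frames.copy()
    edited[:, joint_index] += offset_deg
    return edited
```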