The EyesWeb Tutorial aims at sharing with participants the experience of Casa Paganini – InfoMus in scientific research and technological development. This page introduces the EyesWeb XMI platform (for eXtended Multimodal Interaction). A one-week tutorial, the EyesWeb Week, is organized every year. If you want to learn EyesWeb yourself, this is a good place to start. Further tutorials can also be found on the EyesWeb website.
Published (Last): 18 September 2017
In this webpage you can find the instructions necessary to download, install, and run the DANCE software platform. Details about the platform architecture and data stream formats are provided in Deliverable 4. EyesWeb XMI is a modular system that allows both expert and non-expert users to build multimodal applications.
EyesWeb provides software modules, called blocks, that can be assembled intuitively into programs called patches. The latest version of EyesWeb is version 5. You can download it from the following link. The DANCE example tools and patches are programs, written to be executed by EyesWeb, that allow the user to record, play back, and analyze multimodal data (video, audio, motion capture, sensors).
To run the tools you will need to download the corresponding installers, launch them, and execute the tools as normal Windows applications.
To run the patches you will need to download them and load them into the EyesWeb application (see step 1 on how to download and install EyesWeb). The current version of the DANCE example tools and patches includes applications allowing you to perform different tasks. The recording tool records AVI files. The video is encoded in MPEG-4 format, the resolution is x, and the framerate is 50 fps. Audio is encoded in AAC format at Hz. Two channels are recorded. Multiple instances of the video recorder tool can be started; they can work standalone or synchronized with the other recorders.
The options panel allows you to configure the working mode of the recorder. Audio is sampled at Hz. The user interface is very similar to that of the video recorder tool. The main difference is the visualization part: in this tool the audio waveform is shown instead of the video stream.
The graph in the figure shows the values selected by the user (one among accelerometer, gyroscope, or compass) for each of the 4 IMUs. In the lower left part of the recorder interface you can read the current streaming framerate of each sensor. Below the graph you can read the trial name and the reference clock.
The data is saved by the recording tool in CSV format. The control type section controls the synchronization mode. In slave mode the tool receives the clock time from an external device (the master). Once you have recorded some audio, video, and IMU data, you can play it back using the playback patch. Download the patch and copy the downloaded file to the parent folder of the recorded data. If you did not record any data, you can download some sample data from this website.
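The exact column layout of the CSV files produced by the recording tool is not documented on this page; as a minimal sketch, assuming one timestamp column followed by 3-axis accelerometer values (the column names below are hypothetical, not taken from the DANCE file format specification), the data could be loaded like this:

```python
import csv
import io

# Hypothetical excerpt of a recorder CSV file: a timestamp plus
# 3-axis accelerometer values for one IMU. Column names are assumptions.
sample = """timestamp,acc_x,acc_y,acc_z
0.00,0.01,-0.02,9.81
0.02,0.03,-0.01,9.79
0.04,0.02,0.00,9.80
"""

def load_imu_csv(text):
    """Parse the CSV text into a list of (timestamp, [ax, ay, az]) tuples."""
    rows = []
    for rec in csv.DictReader(io.StringIO(text)):
        t = float(rec["timestamp"])
        acc = [float(rec[k]) for k in ("acc_x", "acc_y", "acc_z")]
        rows.append((t, acc))
    return rows

trials = load_imu_csv(sample)
```

When working with the real recordings, replace the in-memory string with an open file handle on the recorded CSV.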
When switching from one recording to another you first have to stop the currently playing segment; then you can start the new one. During the playback of a file, the video, 3D mocap data, and IMU signals will be displayed.
Now that you have recorded or downloaded some multimodal data and can successfully play it back, you can proceed by performing some analysis on it. In the DANCE project we aim to innovate the state of the art in the automated analysis of expressive movement. We consider movement as a communication channel allowing humans to express and perceive implicit high-level messages, such as emotional states, social bonds, and so on. To study it, we focus on the sets of non-verbal expressive features that are described in detail in Deliverable 5.
The following expressive features can be extracted from multimodal data using the patches you can download below. Besides these expressive features, we are interested in extracting analysis primitives. The simplest unary analysis primitives are statistical moments. Further, more complex examples of unary operators include shape features.
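As an illustration of the simplest unary primitives, the first four statistical moments of a feature series can be computed as follows (a generic sketch, not the DANCE implementation):

```python
import math

def moments(xs):
    """Return mean, standard deviation, skewness, and kurtosis of a sequence
    (population moments, i.e. dividing by n)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    std = math.sqrt(var)
    skew = sum((x - mean) ** 3 for x in xs) / (n * std ** 3) if std else 0.0
    kurt = sum((x - mean) ** 4 for x in xs) / (n * std ** 4) if std else 0.0
    return mean, std, skew, kurt

m, s, sk, k = moments([1.0, 2.0, 3.0, 4.0, 5.0])
```

In practice the input would be a window of an extracted feature (e.g. Energy over one trial) rather than a hand-written list.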
Models for prediction are a further example. Binary and n-ary operators can be applied as well. For example, synchronization can be used to assess coordination between the hands.
Causality can provide information on whether the movement of a joint leads or follows the movement of another joint. The following analysis primitives can be extracted from multimodal data using the patches you can download below. The links reported below summarize the patches for computing features and analysis primitives from IMUs, and the main patches for computing features and analysis primitives from motion capture data. To use and test the patches, follow the steps below.
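As a minimal sketch of a binary primitive, synchronization between two joints (e.g., the two hands) can be approximated by finding the lag that maximizes the cross-correlation of their speed profiles. This is generic illustration code, not the DANCE patch itself:

```python
def cross_corr_lag(a, b, max_lag):
    """Return the lag of b relative to a (in samples) that maximizes the
    mean cross-correlation over the overlapping samples."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        s, n = 0.0, 0
        for i, ai in enumerate(a):
            j = i + lag
            if 0 <= j < len(b):
                s += ai * b[j]
                n += 1
        if n and s / n > best_score:
            best_score, best_lag = s / n, lag
    return best_lag

# Two toy "hand speed" signals in which b lags a by 2 samples.
a = [0, 0, 1, 3, 1, 0, 0, 0, 0, 0]
b = [0, 0, 0, 0, 1, 3, 1, 0, 0, 0]
lag = cross_corr_lag(a, b, max_lag=4)  # → 2
```

A lag close to zero with a high correlation score would indicate tight coordination; a large lag suggests one hand leading the other.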
As reported in the above paragraphs, you have to download and extract some sample data in order to run the DANCE example patches. Without the sample data the example patches will not start, or will start but will not provide any output.
The sample data is contained in a zip file; it is a collection of 2 trials consisting of data recorded by a motion capture system, a video camera, and 4 IMU sensors placed on the dancer’s limbs (wrists and ankles). The zip archive contains the corresponding folders and files for each trial.
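The archive can be extracted with any zip tool; as a sketch, here it is done with Python's standard zipfile module (the file and folder names below are made up stand-ins, not the actual contents of the DANCE archive):

```python
import io
import zipfile

# Build a tiny in-memory archive standing in for the sample-data zip;
# the real archive contains the recorded trials, with its own file names.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("trial_01/imu.csv", "timestamp,acc_x\n0.0,0.01\n")
    zf.writestr("trial_01/video.avi", "")

# List and read the archive contents, as you would with the sample data
# (use zf.extractall(target_dir) to unpack it next to the patches).
with zipfile.ZipFile(buf) as zf:
    names = zf.namelist()
    csv_text = zf.read("trial_01/imu.csv").decode()
```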
If you have downloaded and installed EyesWeb, and downloaded some example patches plus the needed sample data, you are ready to run the patches.
All the recording tools share the same options panel.

The patches for analyzing multimodal data compute the following expressive features.

Slowness: this feature indicates whether the movement is performed slowly or not.

Smoothness: this feature is based on Energy and Slowness. If the movement exhibits high (respectively, low) slowness and no (respectively, many) energy peaks are detected, then smoothness is high (respectively, low).

Lightness: this feature comes from the movement analysis tradition of Rudolf Laban and Frederick C. Lawrence. It is computed by extracting the vertical component of Energy, normalized to the overall amount of Energy in the movement.

Suddenness: this feature is computed using alpha-stable distributions. An alpha-stable fit is performed on the peaks of acceleration.
A movement is sudden when the product between alpha and gamma is high (see Deliverable 2). The algorithm takes as input the 3D joint accelerations over the time window on which the suddenness has to be computed, and fits them to the alpha-stable distribution. The output of the algorithm gets close to 1 (i.e., maximum suddenness) for highly sudden movements.
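A rough sketch of the suddenness pipeline described above: extract the peaks of the acceleration magnitude over a window, then fit an alpha-stable distribution to them (`scipy.stats.levy_stable` could perform such a fit). The fitted alpha and gamma and the threshold below are placeholder values for illustration; the actual fit and decision rule are defined in Deliverable 2:

```python
def acceleration_peaks(acc):
    """Local maxima of an acceleration-magnitude series."""
    return [acc[i] for i in range(1, len(acc) - 1)
            if acc[i] > acc[i - 1] and acc[i] > acc[i + 1]]

def is_sudden(alpha, gamma, threshold):
    """A movement is classified as sudden when alpha * gamma is high."""
    return alpha * gamma > threshold

# Toy acceleration-magnitude window.
acc = [0.1, 0.5, 0.2, 0.9, 0.3, 0.4, 0.1]
peaks = acceleration_peaks(acc)  # values that would feed the alpha-stable fit

# alpha and gamma here stand in for the fitted parameters.
sudden = is_sudden(alpha=1.8, gamma=0.9, threshold=1.0)
```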
An impulsive movement can be performed by a part of the body or by the whole body and is characterized by the following properties: P1, it is sudden, that is, it presents a high variation of speed (either from low to high or from high to low); P2, it is executed with no preparation.
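Properties P1 and P2 suggest a simple detector sketch: flag a frame when the speed varies strongly inside a short window (P1) while the preceding frames show no build-up (P2). The window size and thresholds below are arbitrary illustration values, not those used by the DANCE patches:

```python
def impulsive(speed, i, window=3, jump=1.0, prep=0.2):
    """Flag frame i as impulsive if speed varies by more than `jump`
    within the last `window` frames (P1) while the frames just before
    that window stay below `prep`, i.e. no preparation phase (P2)."""
    if i < window:
        return False
    segment = speed[i - window:i + 1]
    variation = max(segment) - min(segment)
    before = speed[max(0, i - 2 * window):i - window] or [0.0]
    return variation > jump and max(before) < prep

# A speed profile with a sudden, unprepared burst at frame 5.
speed = [0.0, 0.05, 0.0, 0.1, 0.05, 1.5, 0.2, 0.1]
flags = [impulsive(speed, i) for i in range(len(speed))]
```

Note that the frames right after the burst are also flagged, since the burst still falls inside their trailing window; a real detector would merge such runs into one event.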
A fluid movement can be performed by a part of the body or by the whole body and is characterized by the following properties: P1, the movement of each involved joint of the part of the body is smooth, following the standard definitions in the literature of biomechanics; P2, there is an efficient propagation of movement along the kinematic chains, with a minimization of the dissipation of energy.
Fluidity is computed as the distance between the evolution in time of the Humanoid Mass-Spring model and that of the observed movement.

The analysis primitive extracted by the patches below is synchronization. It is an important concept in human-human communication that has been widely addressed by the HCI research community and in movement studies. We split the patches for analyzing multimodal data into 2 groups: patches computing features and analysis primitives from IMUs, and patches computing them from motion capture data.
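As a toy version of the fluidity computation described above (the actual Humanoid Mass-Spring model of the DANCE project is more elaborate, and the spring constants below are arbitrary), one can simulate a damped spring tracking the observed joint position and take the RMS distance between the simulated and the observed trajectory:

```python
import math

def spring_track(targets, k=20.0, c=8.0, dt=0.02):
    """Simulate a unit mass pulled toward each target position by a damped
    spring (stiffness k, damping c); returns the simulated positions."""
    x, v, out = targets[0], 0.0, []
    for t in targets:
        a = k * (t - x) - c * v  # spring force plus viscous damping
        v += a * dt
        x += v * dt
        out.append(x)
    return out

def rms_distance(a, b):
    """Root-mean-square distance between two equal-length trajectories."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# One joint coordinate over time; lower distance = closer to the model.
observed = [0.0, 0.1, 0.3, 0.6, 0.8, 0.9, 1.0]
model = spring_track(observed)
fluidity_score = rms_distance(observed, model)
```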
Run EyesWeb XMI, load one or more patches, and execute them. Note that you need the sample motion capture data to run the patches computing features from motion capture data.