
Department of Psychology

About the Lab

Background

Under the direction of Professor Bill Thompson, the Macquarie Music, Sound and Performance (MSP) Lab was established in 2008 to foster cross-disciplinary motion-analysis research.

The MSP lab was made possible by a Macquarie University Strategic Infrastructure Scheme (MQSIS) in 2007 and 2008, awarded to: Bill Thompson, Arthur Shores, Richard Stevenson, Trevor Case, Mark Williams, Doris McIlwain, John Sutton, and Romina Palermo.

The MSP lab is built around an 8-camera Vicon passive-marker system for optical motion analysis. Four high-resolution MX-F20 cameras allow for the capture of fine-grained motion (down to 0.1 millimeters). Operating at over 250 frames per second, the Vicon system supports movement analyses ranging from full three-dimensional gait analysis to fine-grained facial expression.

Lab Infrastructure

4x MX-F20 2.0 Megapixel cameras
4x MX13+ 1.3 Megapixel cameras
1x MX Ultranet HD (supports 244 cameras)
1x 64-channel Analogue card (audio, force-plate)
1x Windows PC, Intel 2.4 GHz Core 2 Duo
1x MacBook Pro, 15" 2.4 GHz Core 2 Duo
1x Digidesign MBox2 recording studio
1x Rode K2 condenser microphone
1x Sony Handycam (40GB 1MP Hard Drive Hybrid DCRSR65)
1x Sennheiser HD555 headphones
2x Sennheiser HD515 headphones
Software: Vicon Nexus, Pro Tools, Final Cut Studio 2

The Macquarie MSP Lab is an 8-Camera Vicon passive marker system with streaming video and sound

What It Means For You

The MSP lab was established to foster cross-disciplinary research, and to establish Macquarie as a leading institution for motion analysis in the cognitive sciences. Whether for fine-grained facial analysis or full-body motion, the Macquarie MSP Lab has the tools to support your research.

How It Works

Vicon cameras emit infrared light from an LED array, which is reflected by the markers on the subject's body (3 mm to 2 cm) and picked up by each camera's photo-sensor. Each camera produces a two-dimensional coordinate for every marker it sees. Using the calibrated geometry of the capture volume, the MX Ultranet combines these 2D observations into a single 3D coordinate, which is fed to the PC in real time.
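The step of combining several cameras' 2D observations into one 3D point can be illustrated with a standard linear triangulation (direct linear transform) sketch. This is a generic textbook method, not Vicon's proprietary reconstruction; the camera matrices and marker position below are hypothetical.

```python
import numpy as np

def triangulate(P_list, xy_list):
    """Linear (DLT) triangulation: given each camera's 3x4 projection
    matrix P and its observed 2D point (x, y), solve for the single 3D
    point that best explains all views (least squares via SVD)."""
    rows = []
    for P, (x, y) in zip(P_list, xy_list):
        rows.append(x * P[2] - P[0])  # x * (row 3) - (row 1) = 0
        rows.append(y * P[2] - P[1])  # y * (row 3) - (row 2) = 0
    _, _, Vt = np.linalg.svd(np.array(rows))
    X = Vt[-1]                        # null vector = homogeneous 3D point
    return X[:3] / X[3]

def project(P, X):
    """Project a 3D point into a camera's 2D image plane."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two hypothetical calibrated cameras: one at the origin, one offset in x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.2, 0.1, 2.0])    # a marker 2 m in front of the rig
X_est = triangulate([P1, P2], [project(P1, X_true), project(P2, X_true)])
print(X_est)
```

With noise-free observations the estimate recovers the marker exactly; with real cameras, each extra view over-determines the system and the SVD returns the least-squares compromise, which is why adding cameras improves accuracy.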

Vicon diagram

What It Can Be Used For

The Vicon optical motion system is clinically validated for the three-dimensional assessment of gait in children and adults. The system is extensible, allowing for the integration of video, sound, force plates, and EMG.

Vicon Nexus allows for integrated motion, video, and sound during capture and editing.

What We’ve Used It For

Experiment I
Our first investigation examined the multimodal aspects of music: how structural and affective interpretations of music are mapped onto the facial expressions of performers. Integrated motion capture, video, and sound were analysed to elucidate the motor-planning behaviours of singers. The system's fine-grained resolution allowed us to determine that singers produce subtle, unconscious facial mirroring (movements of 1 to 2 mm) when exposed to emotional stimuli.
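Detecting movements of this size amounts to flagging frames where a facial marker departs measurably from its resting position. The sketch below is illustrative only (synthetic trajectory, assumed noise level and threshold), not the lab's actual analysis pipeline.

```python
import numpy as np

def mirror_events(traj_mm, baseline_frames=20, threshold_mm=1.0):
    """Flag frames where a marker is displaced more than `threshold_mm`
    from its resting position (the mean of the first baseline frames)."""
    rest = traj_mm[:baseline_frames].mean(axis=0)
    dist = np.linalg.norm(traj_mm - rest, axis=1)
    return dist > threshold_mm

# Hypothetical brow-marker trajectory (frames x XYZ, in mm):
# 20 resting frames with ~0.05 mm sensor noise, then a 1.5 mm lift.
rng = np.random.default_rng(0)
rest = rng.normal(0.0, 0.05, (20, 3))
lift = rest[-1] + np.array([0.0, 0.0, 1.5])
traj = np.vstack([rest, np.tile(lift, (10, 1))])

flags = mirror_events(traj)
print(flags.sum(), "frames above threshold")
```

A sub-millimetre capture resolution matters here: with coarser tracking, a 1-2 mm displacement would be indistinguishable from sensor noise.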

Experiment II
Our second investigation examined how singers convey information about musical structure in their facial expressions. Musicians were asked to sing a series of musical intervals, while tracked with the motion system. We found that singers use facial expressions, such as eyebrow height, to convey changes in melodic pitch.
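A relationship like this is typically quantified by correlating a facial measure with the sung interval across trials. The numbers below are invented for illustration; the actual data and statistics are in the published study.

```python
import numpy as np

# Hypothetical per-trial data: sung interval size (semitones) and peak
# eyebrow-marker elevation (mm). Values are illustrative only.
intervals = np.array([1, 2, 4, 5, 7, 9, 12], dtype=float)
brow_mm = np.array([0.5, 0.9, 1.8, 2.1, 3.0, 3.9, 5.2])

# Pearson correlation between interval size and eyebrow height.
r = np.corrcoef(intervals, brow_mm)[0, 1]
print(round(r, 3))
```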

User Manual

Available here.



Vicon infrared lens




Recent News

PNAS article features on ABC and The Conversation 19 Oct 12

Bill was interviewed about his research on congenital amusia (tone deafness), recently published in the Proceedings of…

MSPL paper features in Motherboard article 27 Apr 12

A 2012 paper by members of MSPL has featured in an article published by online magazine Motherboard.
