GibbonFACS is a modification of the original human FACS system (Ekman & Friesen 1978) for use with hylobatid species (gibbons and siamangs), and was established following the same methods used in the development of ChimpFACS, MaqFACS, OrangFACS, DogFACS and CatFACS.
The resulting GibbonFACS Manual is an additional, freely available tool for scientists interested in comparative communication research, complementing existing facial action coding systems for the study of facial expressions.
GibbonFACS is a standardized system that requires certification to use. To become a certified user, you must study the manual carefully and then pass the certification test. While learning to discriminate and identify AUs and ADs, detailed viewing of the video clips is essential, as the system is based on facial movement rather than on still images.
The GibbonFACS Manual and the GibbonFACS Test are freely available through this website. Please contact us for the passwords. We keep a record of who is using the system so we can maintain standardisation.
To view the Manual correctly, please install the latest version of Adobe Reader on your computer. To decompress the files (manual and test), please install WinRar.
Click here to download the GibbonFACS manual.
To use the system you need to take a test after training. This ensures that all users are coding in the same way, and so maintains the standardisation of the system.
To access the GibbonFACS Test clips click here. Please contact us for the passwords.
Please note that you can attempt the GibbonFACS Test more than once if you do not pass initially. For each additional attempt, however, you may need to wait several weeks to receive your scores, especially during busy periods. It is also important that trainees take enough time to revise the Manual before a second attempt.
After becoming GibbonFACS certified, a coder will be able to reliably code facial movements in videos and pictures of hylobatids. High-quality close-ups of the face should ideally be recorded, and pictures must be compared with the neutral face of each individual to account for individual variation. Depending on the purpose of coding, two or more cameras should be used in synchrony (e.g. one camera zoomed in on the face and the other recording body and contextual behaviours).
GibbonFACS can be applied to investigate communication and emotion in hylobatids through the analysis of individual facial behaviour.
GibbonFACS was developed thanks to the joint effort of:
The authors thank Robert Zingg and Zoo Zurich (Switzerland), Jennifer Spalton and Twycross Zoo (UK), Neil Spooner & Matt Ford and Howletts Wild Animal Park (UK), Corinne Di Trani and Mulhouse Zoo (France) for allowing us to collect footage of their animals. We also thank Gill Vale for support in collecting some of the video footage, Cátia Caeiro for reliability coding and Wiebke Hoffmann for general assistance.
Psychology Department
University of Portsmouth
Portsmouth, UK
Copyright © All Rights Reserved