Difference between revisions of "Portal:FACES"

[[Category:Faces]]
[[Category:Portal|Faces]]
[[Category:Main Topics]]

Latest revision as of 09:03, 13 March 2014

CoLab Portal for FACES

FACES is a lifespan digital collection of adult emotional facial stimuli. It contains two sets of images of naturalistic faces of 171 women and men, each displaying six facial expressions: neutrality, sadness, disgust, fear, anger, and happiness.

The main focus of the project, besides integrating FACES into the eSciDoc architecture as an online open-source solution, is to add new attributes to individual pictures in the form of standardized metadata (e.g. METS) and to implement user management for different usage rights.
The application therefore has to be flexible enough to integrate further data, such as the outcome of the rating study.
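The idea of attaching standardized, extensible metadata to each picture can be sketched as a minimal per-picture XML record. This is only an illustration under assumed names: `picture_to_xml`, the element names, and the sample ID are invented for this sketch and are not the actual FACES or METS schema.

```python
# Sketch: wrapping per-picture attributes in a minimal XML metadata record.
# Element names and the sample ID are illustrative, not the real FACES schema.
import xml.etree.ElementTree as ET

def picture_to_xml(picture_id, attributes):
    """Serialize one picture's attributes as a simple XML metadata record."""
    root = ET.Element("picture", id=picture_id)
    for name, value in attributes.items():
        attr = ET.SubElement(root, "attribute", name=name)
        attr.text = str(value)
    return ET.tostring(root, encoding="unicode")

xml_record = picture_to_xml("004_y_f_a", {
    "ageGroup": "young",
    "gender": "female",
    "facialExpression": "anger",
    "pictureSet": "A",
})
print(xml_record)
```

Because new attributes are just additional entries in the dictionary, output from later rating studies could be appended without changing the record structure.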

[Image: ESciDoc.Faces.png, linked to the FACES application at http://faces.mpdl.mpg.de/faces/]


FACES Partner

FACES is a collaboration of the MPDL with the Max Planck Institute for Human Development, Center for Lifespan Psychology, Berlin, Germany.

Project coordination is handled by Ursula Flitner, head of the institute’s library.

The content of FACES was collected between 2005 and 2007 by Ulman Lindenberger, Natalie Ebner and Michaela Riediger.

The content model of the FACES software design was influenced by the predecessor database developed at the institute by Matthias Bindernagel and Natalie Ebner.

Functionalities and Concepts
  • Picture search based on the available attributes (currently ID, age group, gender, facial expression, and picture set)
  • Detailed view of single pictures and their attributes, with navigation between pictures
  • Creation and export of public and private subsets for the analysis of specific research questions
  • Management of an unlimited number of attributes per picture (e.g. adding new attributes based on the output of rating studies)
  • User and rights management to implement different access rights
See also: Functionalities, Data Model, Content Models
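The attribute-based picture search listed above can be sketched in a few lines over an in-memory collection. The records, attribute names, and sample IDs below are illustrative assumptions, not real FACES data or the actual implementation.

```python
# Sketch: attribute-based picture search, matching every given criterion.
# Records and attribute names are illustrative, not real FACES data.
def search(pictures, **criteria):
    """Return all pictures whose attributes match every criterion."""
    return [p for p in pictures
            if all(p.get(k) == v for k, v in criteria.items())]

pictures = [
    {"id": "004_y_f_a", "ageGroup": "young", "gender": "female",
     "facialExpression": "anger", "pictureSet": "A"},
    {"id": "117_o_m_h", "ageGroup": "older", "gender": "male",
     "facialExpression": "happiness", "pictureSet": "B"},
]

hits = search(pictures, ageGroup="older", facialExpression="happiness")
print([p["id"] for p in hits])  # → ['117_o_m_h']
```

Because the criteria are passed as keyword arguments, newly added attributes become searchable without any change to the search function itself.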
FACES Support

Please visit our FACES Support Page to find further information on FACES.

To get access to FACES, please fill out the Release Agreement (FACES_Release_Agreement.pdf, available on this wiki).

Do you want to contact us directly? Feel free to email the FACES Support Team at faces-support@gwdg.de!