The Evolved Person Perception & Cognition Lab

FACE STIMULI (mostly)


In no particular order....

(please note that you will probably need to seek permission to use these resources)

But before that, a bit of fun: 50 things that look like faces



  • The AI Generated faces database

    • 200,000+ faces generated by AI. Mildly terrifying in their authenticity, but very useful for face researchers. There are some limits on the ranges, but the faces differ in terms of:

      • Ethnicity (White, Black, Latino, Asian),

      • Sexes (Male, Female),

      • Ages (Young adult, Adult, Child, Middle-aged, Infant)

      • Hair length (Short, Medium, Long) and colour (Brown, Black, Blond, Gray)

      • Eye colour (Brown, Grey, Blue, Green)

      • Emotion (Joy, Neutral).

      • Natural or beautified.

      • Head pose (Left, Right, or Front facing)

      • Background colour


    No rating norms as yet.


    • You can also upload photos to create anonymised versions (they take your photo and regenerate it). Useful for legitimate security profiles and/or stimuli datasets.

 

  • Glasgow Face Database: Burton et al.'s Glasgow Face Matching Test (GFMT) and Glasgow Unfamiliar Face Database (GUFD) are now hosted on the York FaceVar Lab pages at the University of York.

 

  • Tarr faces from the TarrLab at Carnegie Mellon University. You can also try Face Place, which contains multiple images of over 200 individuals of various races.

 

  • FAMED (Faces and Motion Exeter Database). This is a database of 32 male actors in various formats (blurred, pixelated, etc.).

 

  •  Face Image Meta-Database (fIMDb) - version 2. A collection of resources from Cliff Workman (brain enthusiast). There's all sorts of useful face stuff here; the link for the database goes to a spreadsheet with lots of onward links.

  • NIST Color FERET from The National Institute of Standards and Technology (NIST): http://www.nist.gov/itl/iad/ig/colorferet.cfm.

 

 

  • Caucasian Faces - the site and database were designed and maintained by Dr Libor Spacek at the University of Essex.


  • The Chicago Face Database (extended) - CFD. This is a database of face images comprising both the original set (main CFD) and a number of extension face sets. The faces have all been standardised, models are aged 17-65 years (range of measured ages), and norming data are provided. Developed by Debbie S. Ma, Joshua Correll, and Bernd Wittenbrink at the University of Chicago. (A small sketch for loading the norming data follows the list of sets below.)

    • Main CFD - contains 597 Asian, Black, Latino, and White faces (female and male). Cite: Ma, Correll, & Wittenbrink (2015). The Chicago Face Database: A Free Stimulus Set of Faces and Norming Data. Behavior Research Methods, 47, 1122-1135. https://doi.org/10.3758/s13428-014-0532-5.

    • CFD-MR - contains 88 mixed-race faces. Cite: Ma, Kantner, & Wittenbrink (2020). Chicago Face Database: Multiracial Expansion. Behavior Research Methods. https://doi.org/10.3758/s13428-020-01482-5.

    • CFD-INDIA - contains 142 Indian faces (recruited in Delhi). Cite: Lakshmi, Wittenbrink, Correll, & Ma (2021). The India Face Set: International and Cultural Boundaries Impact Face Impressions and Perceptions of Category Membership. Frontiers in Psychology, 12, 161. https://doi.org/10.3389/fpsyg.2021.627678.
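    If you want to build a matched stimulus subset from the norming data, a minimal Python sketch might look like the following. It assumes you have exported the norming spreadsheet to a CSV; the file name cfd_norms.csv and the column names (Model, Race, Gender, Age) are assumptions, so check the CFD codebook for the real headers and value codes.

      # Minimal sketch: pick a matched CFD subset from the (assumed) norming CSV.
      import pandas as pd

      norms = pd.read_csv("cfd_norms.csv")  # hypothetical export of the norming spreadsheet

      # e.g. Black and White female targets aged 20-30 (race/gender codes assumed)
      subset = norms[norms["Race"].isin(["B", "W"]) &
                     (norms["Gender"] == "F") &
                     norms["Age"].between(20, 30)]

      print(subset["Model"].tolist()[:10])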

 

 

  • Center for Vital Longevity Face Database - set up by Meredith Minear and Denise Park, and now hosted at the Center for Vital Longevity (the database was originally developed at the University of Michigan). There's quite a range of stimuli here, including different races, emotions, and orientations.

 

  • Labeled Faces in the Wild - based at the Computer Science Department at the University of Massachusetts, this is "a database of face photographs designed for studying the problem of unconstrained face recognition. The data set contains more than 13,000 images of faces collected from the web. Each face has been labeled with the name of the person pictured. 1680 of the people pictured have two or more distinct photos in the data set. The only constraint on these faces is that they were detected by the Viola-Jones face detector. More details can be found in the technical report below."

  • Computational Models of Kinship Verification - from the Advanced Multimedia Processing (AMP) Lab (Ruogu Fang, Kevin D. Tang, Noah Snavely, Tsuhan Chen)

    • Contains a kinship database of 143 parent-child pairs. Publicly available, but please cite the following paper:

      • Ruogu Fang, Kevin D. Tang, Noah Snavely, Tsuhan Chen. "Towards Computational Models of Kinship Verification", IEEE International Conference on Image Processing, 2010.

  • The Dartmouth Database of Child Faces contains images of 40 male and 40 female models between the ages of 6 and 16.

  •  UB KinFace Dataset (version 2), from Northeastern University.

    • Contains 600 images of 400 people, divisible into 200 groups, each with a child image, a young-parent image, and an old-parent image. Mainly celebrities and politicians.

    • Non-commercial use only. Please cite the following references if you use the set.

      • Siyu Xia, Ming Shao, Jiebo Luo, and Yun Fu, “Understanding Kin Relationships in a Photo”, IEEE Transactions on Multimedia (T-MM), Volume: 14, Issue: 4, Page(s): 1046-1056, 2012.

      • Siyu Xia, Ming Shao and Yun Fu, “Kinship Verification through Transfer Learning,” International Joint Conferences on Artificial Intelligence (IJCAI), pp. 2539-2544, 2011.

      • Ming Shao, Siyu Xia and Yun Fu, “Genealogical Face Recognition based on UB KinFace Database,” IEEE CVPR Workshop on Biometrics (BIOM), 2011.

  • The Belfast Natural Induced Emotion Database - from Sneddon, McRorie, McKeown & Hanratty at Queen's University Belfast. The database contains video recordings of mild/moderate emotional responses to lab-based emotion-inducing tasks (the files are quite big).

 

  • Ebner faces, Max Planck Institute: face database that includes older faces and emotional faces.

"FACES is a lifespan digital collection of adult emotional facial stimuli. It contains two sets of images of naturalistic faces of 171 women and men displaying each of 6 facial expressions: neutrality, sadness, disgust, fear, anger, and happiness. The specialty of this collection is that it comprises pictures of persons of three different age groups: young (n = 58), middle-aged (n = 56), and older (n = 57)."

· Cite: Ebner, N., Riediger, M., & Lindenberger, U. (2010). FACES—A database of facial expressions in young, middle-aged, and older women and men: Development and validation. Behavior Research Methods, 42, 351-362. doi:10.3758/BRM.42.1.351

 

  • The MR2 is a multi-racial, mega-resolution database of facial stimuli that contains 74 full-colour images of men and women of European, African, and East Asian descent.

· Cite: Strohminger, N., Gray, K., Chituc, V., Heffner, J., Schein, C., and Heagins, T.B. (in press). The MR2: A multi-racial mega-resolution database of facial stimuli. Behavior Research Methods.

 

 

  • The OUI-Adience Face Image Project - This is a jointly owned web page from the Computer Vision Lab at the Open University of Israel (OUI) and Adience, on the analysis of face images. Their aim is to "provide open research resources reflecting the challenges of modern face recognition and processing tasks."

 

  • Miami University Deception Detection Database (MU3D). This is a free resource from Lloyd, Deska, Hugenberg, McConnell, Humphrey, & Kunstman containing 320 videos of Black and White male and female targets telling truths and lies.

    • You will need to sign a usage agreement (see web page), and you can read about the database here (please cite):


  • NimStim Face Stimulus Set. See: Ellertsen, K., Marcus, D.J., & Nelson, C.A. (April, 2002). Categorization of Facial Expressions in Children and Adults: Establishing a Larger Stimulus Set. Poster presented at the Cognitive Neuroscience Society annual meeting, San Francisco.

 

  • Face Research Lab London Set, from Lisa DeBruine and Benedict Jones. This is a set of neutral and smiling faces (102 adults; 1350x1350 pixels, full colour; template files included) to be used with Psychomorph or WebMorph.org. Self-reported age, gender, and ethnicity are included in the file london_faces_info.csv, along with attractiveness ratings (on a 1-7 scale) from 2,513 raters (ages 17-90). A small loading sketch follows.
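    As a rough example of how you might subset these stimuli, here is a minimal Python sketch. The column names used (face_id, face_age, face_sex, face_eth) are assumptions, so check the actual headers in your download; the attractiveness ratings may also ship in a separate ratings file.

      # Minimal sketch: subset the London Set using the self-report info file.
      # Column names are assumptions -- check the headers in your download.
      import pandas as pd

      info = pd.read_csv("london_faces_info.csv")

      # e.g. keep self-reported female faces aged 20-40 for a matched stimulus set
      subset = info[(info["face_sex"] == "female") & info["face_age"].between(20, 40)]
      print(subset["face_id"].tolist())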

 

  • InterFace. This is a free, downloadable, standalone, Matlab-based application (you don't need Matlab to run it) that lets you carry out all sorts of manipulations of face stimuli. The downloadable package includes a software guide, example images, and a Matlab converter if you don't have Matlab.

    • You will need to cite: Kramer, R. S. S., Jenkins, R., & Burton, A. M. (2016). InterFace: A software package for face image warping, averaging, and principal components analysis. Behavior Research Methods. Advance online publication. (a copy of the paper accompanies the download).

 

  • The Park Aging Mind Laboratory - the aging-and-mind lab at UT Dallas and the Center for Vital Longevity. Here is a very useful (multi-expression) FACE DATABASE; there is also a CULTURAL NORMS database and other (OBJECT/SCENE) stimuli.

    • You will need to cite:

      • FACES: Minear, M., & Park, D.C. (2004). A lifespan database of adult facial stimuli. Behavior Research Methods, Instruments, & Computers, 36, 630-633.

      • NORMS: Yoon, C., Feinberg, F., Hu, P., Gutchess, A.H., Hedden, T., Chen, H., Jing, Q., Cui, Y., & Park, D.C. (2004). Category norms as a function of culture and age: Comparisons of item responses to 105 categories by American and Chinese adults. Psychology and Aging, 19(3), 379-393.

 

Infant/child face databases

  • LIRIS Children Spontaneous Facial Expression Video Database (LIRIS-CSE)

    This “database contains spontaneous / natural facial expressions of  12 children in diverse settings with variable recording scenarios showing six universal or prototypic emotional expressions (happiness, sadness, anger, surprise, disgust and fear). Children are recorded in constraint free environment (no restriction on head movement, no restriction on hands movement, free sitting setting, no restriction of any sort) while they watched specially built / selected stimuli. This constraint free environment allowed us to record spontaneous / natural expression of children as they occur. The database has been validated by 22 human raters”

    • To request database download (for research purpose only) visit the project webpage

    • A reprint of the paper can be found here: A novel database of Children's Spontaneous Facial Expressions (LIRIS-CSE). Rizwan Ahmed Khan, Crenn Arthur, Alexandre Meyer, Saida Bouakaz. arXiv (2018) preprint, arXiv:1812.01555.

Database collections

 Other

  • The Makeup Induced Face Spoofing (MIFS) dataset.

    • This data set comprises 107 makeup transformations of faces; the clips are taken from randomly selected makeup tutorials found on YouTube. You will need to ask for permission to use the data set (details on the web pages).

    • You will need to cite:

      • A. Dantcheva, C. Chen, A. Ross, "Can Facial Cosmetics Affect the Matching Accuracy of Face Recognition Systems?," Proc. of 5th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS), (Washington DC, USA), September 2012.

      • C. Chen, A. Dantcheva, A. Ross, "Automatic Facial Makeup Detection with Application in Face Recognition," Proc. of 6th IAPR International Conference on Biometrics (ICB), (Madrid, Spain), June 2013.

      • C. Chen, A. Dantcheva, A. Ross, "An Ensemble of Patch-based Subspaces for Makeup-Robust Face Recognition," Information Fusion Journal, Vol. 32, pp. 80 - 92, November 2016.

      • C. Chen, A. Dantcheva, T. Swearingen, A. Ross, "Spoofing Faces Using Makeup: An Investigative Study," Proc. of 3rd IEEE International Conference on Identity, Security and Behavior Analysis (ISBA), (New Delhi, India), February 2017.

 

  • The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS)

    • Courtesy of Livingstone & Russo

      • The RAVDESS (open access database) contains 7356 files. Each file was rated 10 times on emotional validity, intensity, and genuineness. Ratings were provided by 247 individuals who were characteristic of untrained adult research participants from North America. A further set of 72 participants provided test-retest data. High levels of emotional validity, interrater reliability, and test-retest intrarater reliability were reported.

    • The construction and validation of the RAVDESS is described in:

CITE: Livingstone SR, Russo FA (2018) The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English. PLoS ONE 13(5): e0196391. https://doi.org/10.1371/journal.pone.0196391
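
If you download the set, note that the RAVDESS filenames encode the stimulus attributes as a seven-part numerical code (modality, vocal channel, emotion, intensity, statement, repetition, actor). As a rough Python sketch, selecting (say) the fearful speech files might look like the following; the field order and codes here follow the documented naming convention, but double-check them against the README in your copy.

    # Minimal sketch: select fearful speech files from a local RAVDESS download.
    # Documented filename fields: modality-channel-emotion-intensity-statement-
    # repetition-actor, e.g. "03-01-06-01-02-01-12.wav". Emotion "06" = fearful,
    # channel "01" = speech; verify these codes against the README in your copy.
    from pathlib import Path

    def parse_ravdess_name(path):
        fields = ("modality", "channel", "emotion", "intensity",
                  "statement", "repetition", "actor")
        return dict(zip(fields, path.stem.split("-")))

    ravdess_dir = Path("RAVDESS")  # wherever you unpacked the files
    fearful_speech = [p for p in ravdess_dir.rglob("*.wav")
                      if parse_ravdess_name(p)["emotion"] == "06"
                      and parse_ravdess_name(p)["channel"] == "01"]
    print(len(fearful_speech), "fearful speech files found")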

 Some collections:

  • Lots of face stimuli here:

    • The Face stimulus and tool collection: this is an 'ongoing project' by Ryan Stolier. The site contains links to face stimuli sets and face stimuli tools.

      and

    • Open Science Club - a collection of tools and stimuli from a group of Chinese researchers on OSF.

More face resources can be found here

  •  Face stimulus and tool collection - including databases, tools, and other things, as compiled by Ryan M. Stolier at NYU Psychology.

  •  Psychomorph (Tiddeman and Perrett, 2001) - Psychomorph is a very handy way of manipulating face images for use as stimuli in experiments. It runs in Java. A very handy user guide by Clare Sutherland can be found here.

  • WebMorph (@webmorph_org). WebMorph is an online app for manipulating face images for research (morphing, transforming, and morphometric techniques). It was developed by the ever-excellent Lisa DeBruine (@LisaDeBruine) of the University of Glasgow (and Bernie Tiddeman). You will need to create an account to use it. Lots of resources there.

    • A handy manual can be found here.

Other Visual Stimuli

  • Duñabeitia et al.'s MultiPic: A standardized set of 750 drawings with norms for six European languages

    • A set of 750 coloured pictures of concrete concepts/objects. The database has been normed across six European languages (Spanish, British English, Dutch, Italian, French, & German).

        The paper can be found here: Duñabeitia, J. A., Crepaldi, D., Meyer, A. S., New, B., Pliatsikas, C., Smolka, E., & Brysbaert, M. (2017). MultiPic: A standardized set of 750 drawings with norms for six European languages. The Quarterly Journal of Experimental Psychology, (just-accepted), 1-24.

        ... and the figures can be found here.