FACE STIMULI (mostly)
In no particular order....
(please note that you will probably need to seek permission to use these resources)
But before that, a bit of fun: 50 things that look like faces
Stirling database: This is a big database with lots of stimuli and links to other databases.
Glasgow Face Database: Burton et al.'s Glasgow Face Matching Test (GFMT) and Glasgow Unfamiliar Face Database (GUFD) are now hosted on the York FaceVar Lab pages at the University of York.
FAMED (Faces and Motion Exeter Database). This is a database of 32 male actors in various formats (blurred, pixelated, etc.).
Face Image Meta-Database (fIMDb) - version 2: a collection of resources from Cliff Workman (brain enthusiast). There's all sorts of useful face material here; the database link goes to a spreadsheet with many further links.
NIST Color FERET from The National Institute of Standards and Technology (NIST): http://www.nist.gov/itl/iad/ig/colorferet.cfm.
Biometric recognition team faces (c/o The Institute of Control and Information Engineering).
Caucasian Faces - the site and database were designed and maintained by Dr Libor Spacek at the University of Essex.
FEI face database - a Brazilian face database by Dr. Carlos Eduardo Thomaz.
Center for Vital Longevity Face Database - set up by Meredith Minear and Denise Park at the Center for Vital Longevity, University of Michigan. There's quite a range of stimuli here, including different races, emotions, and orientations.
Labeled Faces in the Wild - based at the Computer Science Department at the University of Massachusetts, this is "a database of face photographs designed for studying the problem of unconstrained face recognition. The data set contains more than 13,000 images of faces collected from the web. Each face has been labeled with the name of the person pictured. 1680 of the people pictured have two or more distinct photos in the data set. The only constraint on these faces is that they were detected by the Viola-Jones face detector. More details can be found in the technical report below."
The Dartmouth Database of Child Faces contains images of 40 male and 40 female models between the ages of 6 and 16.
The Belfast Natural Induced Emotion Database - from Sneddon, McRorie, McKeown & Hanratty at Queen's University Belfast. The database contains video recordings of mild to moderate emotional responses to lab-based emotion-induction tasks (the files are quite big).
Ebner faces, Max Planck Institute: a face database that includes older faces and emotional faces.
“FACES is a lifespan digital collection of adult emotional facial stimuli. It contains two sets of images of naturalistic faces of 171 women and men displaying each of 6 facial expressions: neutrality, sadness, disgust, fear, anger, and happiness. The specialty of this collection is that it comprises pictures of persons of three different age groups: young (n = 58), middle-aged (n = 56), and older (n = 57)."
· Cite: Ebner, N., Riediger, M., & Lindenberger, U. (2010). FACES—A database of facial expressions in young, middle-aged, and older women and men: Development and validation. Behavior Research Methods, 42, 351-362. doi:10.3758/BRM.42.1.351
The MR2 is a multi-racial, mega-resolution database of facial stimuli that contains 74 full-colour images of men and women of European, African, and East Asian descent.
· Cite: Strohminger, N., Gray, K., Chituc, V., Heffner, J., Schein, C., and Heagins, T.B. (in press). The MR2: A multi-racial mega-resolution database of facial stimuli. Behavior Research Methods.
Facial emotion database FEED – Technical University Munich FG-NET Database with Facial Expressions and Emotions.
The City Infant Faces Database - a collection of babies' faces (colour and B&W; 157 images each) expressing emotions, with associated ratings/validation data. You will need to request permission to access the images; just follow the link above. All courtesy of Webb, Ayers, & Endress at City, University of London.
Cite: Webb, R., Ayers, S., & Endress, A. (2017). The City Infant Faces Database: A validated set of infant facial expressions. Behavior Research Methods. doi: 10.3758/s13428-017-0859-9
The OUI-Adience Face Image Project - a jointly owned web page from the Computer Vision Lab at the Open University of Israel (OUI) and Adience, on the analysis of face images. Their aim is to "provide open research resources reflecting the challenges of modern face recognition and processing tasks."
NimStim Face Stimulus Set. See: Ellertsen, K., Marcus, D. J., & Nelson, C. A. (April 2002). Categorization of facial expressions in children and adults: Establishing a larger stimulus set. Poster presented at the Cognitive Neuroscience Society annual meeting, San Francisco.
Face Research Lab London Set, from Lisa DeBruine & Benedict Jones. This is a set of neutral and smiling faces (102 adults; 1350x1350 pixels, full colour; template files included) for use with Psychomorph or WebMorph.org. The file london_faces_info.csv contains self-reported age, gender, and ethnicity, along with attractiveness ratings (on a 1-7 scale) from 2513 raters (aged 17-90).
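Because the ratings ship as a plain CSV, they are easy to load and summarise directly. A minimal sketch with the standard library (the column names and values below are illustrative assumptions, not the actual contents of london_faces_info.csv):

```python
import csv
import io
from statistics import mean

# Illustrative sample only: these column names and values are assumptions,
# stood in for the real london_faces_info.csv distributed with the set.
sample_csv = """face_id,face_age,face_gender,face_eth,rating
001,28,female,white,4.2
002,35,male,black,3.8
003,22,female,east_asian,5.1
"""

# Parse each row into a dict keyed by the header line.
rows = list(csv.DictReader(io.StringIO(sample_csv)))
ratings = [float(r["rating"]) for r in rows]
print(f"{len(rows)} faces, mean rating {mean(ratings):.2f} on the 1-7 scale")
```

For the real file, replace `io.StringIO(sample_csv)` with `open("london_faces_info.csv")` and adjust the column names to match.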
InterFace - a free, downloadable, standalone Matlab-based application (you don't need Matlab to run it) that will allow you to do all sorts of manipulations of face stimuli. The downloadable package includes a software guide, example images, and a Matlab converter for those without Matlab.
You will need to cite: Kramer, R. S. S., Jenkins, R., & Burton, A. M. (2016). InterFace: A software package for face image warping, averaging, and principal components analysis. Behavior Research Methods. Advance online publication. (a copy of the paper accompanies the download).
The Park Aging Mind Laboratory - from the Aging Mind Lab at UT Dallas and the Center for Vital Longevity. Here you'll find a very useful (multi-expression) FACE DATABASE; there is also a CULTURAL NORMS database and other (OBJECT/SCENE) stimuli.
You will need to cite:
FACES: Minear, M., & Park, D. C. (2004). A lifespan database of adult facial stimuli. Behavior Research Methods, Instruments, & Computers, 36, 630-633.
NORMS: Yoon, C., Feinberg, Hu, P., Gutchess, A. H., Hedden, T., Chen, H., Jing, Q., Cui, Y., & Park, D. C. (2004). Category norms as a function of culture and age: Comparisons of item responses to 105 categories by American and Chinese adults. Psychology and Aging, 19(3), 379-393.
Created by Professor Shiguang Shan of the Visual Information Processing and Learning (VIPL) group.
There are currently 99,594 images of 1,040 Chinese people (595 male; 445 female). The images vary in pose, expression, accessory, and lighting, and more besides...
The Makeup Induced Face Spoofing (MIFS) dataset.
This dataset comprises 107 faces with makeup transformations; the clips are taken from randomly selected makeup tutorials found on YouTube. You will need to ask permission to use the dataset (details on the web pages).
You will need to cite:
A. Dantcheva, C. Chen, A. Ross, "Can Facial Cosmetics Affect the Matching Accuracy of Face Recognition Systems?," Proc. of 5th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS), (Washington DC, USA), September 2012.
C. Chen, A. Dantcheva, A. Ross, "Automatic Facial Makeup Detection with Application in Face Recognition," Proc. of 6th IAPR International Conference on Biometrics (ICB), (Madrid, Spain), June 2013.
C. Chen, A. Dantcheva, A. Ross, "An Ensemble of Patch-based Subspaces for Makeup-Robust Face Recognition," Information Fusion Journal, Vol. 32, pp. 80 - 92, November 2016.
C. Chen, A. Dantcheva, T. Swearingen, A. Ross, "Spoofing Faces Using Makeup: An Investigative Study," Proc. of 3rd IEEE International Conference on Identity, Security and Behavior Analysis (ISBA), (New Delhi, India), February 2017.
The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS)
Courtesy of Livingstone & Russo
The RAVDESS (open access database) contains 7356 files. Each file was rated 10 times on emotional validity, intensity, and genuineness. Ratings were provided by 247 individuals who were characteristic of untrained adult research participants from North America. A further set of 72 participants provided test-retest data. High levels of emotional validity, interrater reliability, and test-retest intrarater reliability were reported.
The construction and validation of the RAVDESS is described in:
CITE: Livingstone SR, Russo FA (2018) The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English. PLoS ONE 13(5): e0196391. https://doi.org/10.1371/journal.pone.0196391.
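Each RAVDESS file is named with a seven-part numerical identifier encoding its content (modality, vocal channel, emotion, intensity, statement, repetition, actor), as described in the database's documentation. A minimal Python sketch of decoding such a filename (the code mappings below follow that documentation; treat them as a sketch to verify against the official description):

```python
# Emotion codes from the RAVDESS filename convention.
EMOTIONS = {
    "01": "neutral", "02": "calm", "03": "happy", "04": "sad",
    "05": "angry", "06": "fearful", "07": "disgust", "08": "surprised",
}

def parse_ravdess(filename: str) -> dict:
    """Split a RAVDESS filename like '03-01-06-01-02-01-12.wav' into labelled fields."""
    stem = filename.rsplit(".", 1)[0]
    modality, channel, emotion, intensity, statement, repetition, actor = stem.split("-")
    return {
        "modality": {"01": "full-AV", "02": "video-only", "03": "audio-only"}[modality],
        "vocal_channel": {"01": "speech", "02": "song"}[channel],
        "emotion": EMOTIONS[emotion],
        "intensity": {"01": "normal", "02": "strong"}[intensity],
        "statement": statement,
        "repetition": int(repetition),
        "actor": int(actor),
        # Odd-numbered actors are male, even-numbered are female.
        "actor_sex": "male" if int(actor) % 2 else "female",
    }

info = parse_ravdess("03-01-06-01-02-01-12.wav")
print(info["emotion"], info["vocal_channel"], info["actor_sex"])
```

A parser like this makes it straightforward to filter the 7356 files down to, say, audio-only fearful speech before analysis.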
Lots of face stimuli here:
More face resources can be found here
Face stimulus and tool collection - including databases, tools, and other things, as compiled by Ryan M. Stolier at NYU Psychology.
Morphases: bit of fun really.
Other Visual Stimuli
Duñabeitia et al.'s MultiPic: A standardized set of 750 drawings with norms for six European languages
A set of 750 coloured pictures of concrete concepts/objects. The database has been normed across six European languages (Spanish, British English, Dutch, Italian, French, & German).
The paper can be found here (Duñabeitia, J. A., Crepaldi, D., Meyer, A. S., New, B., Pliatsikas, C., Smolka, E., & Brysbaert, M. (2017). MultiPic: A standardized set of 750 drawings with norms for six European languages. The Quarterly Journal of Experimental Psychology, (just-accepted), 1-24.)
... and the figures found here.