Data and description

ChaLearn Looking at People Workshop on Automatic Personality Analysis and 
First Impressions Challenge @ ECCV2016

First impressions challenge

Description & data

As part of the Speed Interviews project, we are organizing a challenge on “first impressions”, in which participants will develop solutions for recognizing personality traits of users in short video sequences. We are making available a large, newly collected data set, sponsored by Microsoft, of at least 10,000 15-second videos collected from YouTube, annotated with personality traits by Amazon Mechanical Turk (AMT) workers. 

Table 1. Sample videos that will be used for the first impressions challenge.

The traits to be recognized correspond to the “big five” personality traits used in psychology and well known to hiring managers who use standardized personality profiling: Extroversion, Agreeableness, Conscientiousness, Neuroticism, and Openness to experience.

As is well known, first impressions are highly important in many contexts, such as human resources and job interviews. This work could become very relevant for training young people to present themselves better by changing their behavior in simple ways. The participants who obtain the best results in the challenge will be invited to submit a paper to the workshop.

The challenge is part of the Speed Interviews project; you can learn more about it in the corresponding section. Additionally, the following video provides a comprehensive overview of the project:

Speed interviews project


As with previous challenges organized by ChaLearn, the first impressions challenge will be based on the CodaLab platform. Information about registration and participation can be found on the competition site:

In broad terms, the challenge will proceed as follows:

  1. Participants register for the challenge. 
  2. Development (labeled) and validation (unlabeled) data sets are made available to registered participants. 
  3. Participants develop their trait recognition methods using development and validation data.  
  4. Validation labels are released; participants can tune their methods and use development+validation data to train their final models.
  5. Final evaluation (test) data are released; participants make predictions for test samples and submit them via CodaLab.
  6. Participants submit fact sheets describing their methods.
  7. Organizers carry out the verification process and notify participants of the final results. 
  8. Top ranked participants with verified code are eligible for prizes. 
  9. Top ranked participants are encouraged to submit a paper to the associated workshop. 
  10. Prizes are awarded during the ECCV workshop in Amsterdam. 


Participants will be evaluated with a measure based on the number of correctly recognized traits in the test set. More information will be added soon. 
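Since the official measure has not yet been announced, the following is only an illustrative sketch. It assumes each test video receives one continuous score in [0, 1] per trait, and scores accuracy as one minus the absolute error, averaged over all videos and all five traits (the function name and data layout are hypothetical):

```python
# Hypothetical evaluation sketch -- the official challenge metric has not
# been published. Assumes predictions and ground truth are dicts mapping
# each of the five traits to a score in [0, 1], one dict per test video.

TRAITS = ["extroversion", "agreeableness", "conscientiousness",
          "neuroticism", "openness"]

def mean_accuracy(predictions, ground_truth):
    """Average of 1 - |prediction - truth| over all videos and traits."""
    total, count = 0.0, 0
    for pred, truth in zip(predictions, ground_truth):
        for trait in TRAITS:
            total += 1.0 - abs(pred[trait] - truth[trait])
            count += 1
    return total / count
```

Under this convention a perfect submission scores 1.0 and a maximally wrong one scores 0.0.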


The top three ranked participants of the challenge will receive prizes as follows (estimated; subject to the available budget):

  • First place:   1500 USD + 400 USD travel grant + award certificate 
  • Second place:  1000 USD + 400 USD travel grant + award certificate 
  • Third place:    500 USD + 400 USD travel grant + award certificate

In addition, top ranked participants will be invited to follow the workshop submission guide for inclusion of a description of their system in the ECCV 2016 workshop proceedings. Participants of the workshop or challenge will be invited to submit revised and extended versions of their papers to be considered for publication in a special issue on personality analysis in the IEEE Transactions on Affective Computing. 

Important dates

  • May 15th, 2016: Beginning of the quantitative competition; release of development data (with labels) and validation data (without labels).
  • June 30th, 2016: Release of encrypted final evaluation data (without labels) and validation labels. Participants can start training their methods with the whole data set.
  • July 3rd, 2016: Release of final evaluation data decryption key. Participants start predicting the results on the final evaluation data.
  • July 13th, 2016: End of the quantitative competition. Deadline for submitting predictions on the final evaluation data. Deadline for code submission. The organizers start code verification by running it on the final evaluation data. 
  • July 15th, 2016: Deadline for submitting the fact sheets.
  • July 15th, 2016: Release of the verification results to the participants for review. Top ranked participants are invited to follow the workshop submission guide for inclusion in the ECCV 2016 ChaLearn Looking at People workshop proceedings.