Joint Contest on Multimedia Challenges Beyond Visual Analysis @ICPR2016

Workshop and Challenge on Multimedia Challenges

Cancún, México
December, 2016 (exact day TBA)

This page provides additional information about the data and evaluation metrics for the four tracks of the Joint Contest on Multimedia Challenges Beyond Visual Analysis  @ ICPR16

Track 1: First impressions

As part of the speed interviews project, we are organizing a challenge on “first impressions”, in which participants will develop solutions for recognizing the personality traits of users in short video sequences. We are making available a large, newly collected data set, sponsored by Microsoft, of at least 10,000 15-second videos collected from YouTube and annotated with personality traits by Amazon Mechanical Turk (AMT) workers. A first round of the first impressions challenge was run as part of the ECCV16 workshop program. In this second stage of the challenge, participants can further improve their solutions and collaborate with other participants through the coopetition scheme, in which participants are motivated to collaborate in order to succeed in the competition.

[Sample videos 1–4 omitted]

Table 1. Sample videos that will be used for the first impressions challenge.

The traits to be recognized correspond to the “big five” personality traits used in psychology and well known to hiring managers who use standardized personality profiling: Extroversion, Agreeableness, Conscientiousness, Neuroticism, and Openness to experience.

Evaluation of track 1.    Participants will be evaluated by a measure associated with the number of traits correctly recognized in the test set. More specifically, the evaluation computes the mean accuracy, over all test videos and the five traits, between the predicted continuous values and the continuous ground-truth values.
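The description above leaves the exact formula implicit; one plausible reading is to score each continuous prediction as 1 minus its absolute error and average over all videos and traits. The sketch below assumes trait values lie in [0, 1], and the function name `mean_accuracy` is illustrative, not the official scoring code:

```python
def mean_accuracy(predictions, ground_truth):
    # predictions, ground_truth: lists of per-video trait vectors,
    # each with 5 continuous values assumed to lie in [0, 1].
    # Per-prediction accuracy is assumed to be 1 - |pred - true|,
    # averaged over every video and every trait.
    errors = [abs(p - t)
              for pred_vec, true_vec in zip(predictions, ground_truth)
              for p, t in zip(pred_vec, true_vec)]
    return 1.0 - sum(errors) / len(errors)
```

With this reading, a perfect submission scores 1.0 and the score decreases linearly with the mean absolute error.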

Track 2: Isolated gesture recognition track

We are organizing a track on isolated gesture recognition from RGB-D data, where the goal is to develop methods for recognizing the category of human gestures from segmented RGB-D video. A new data set, the ChaLearn LAP RGB-D Isolated Gesture Dataset (IsoGD), is used for this track (sample images taken from this data set are shown in Figure 1). This database includes 47,933 RGB-D gesture videos (about 9 GB). Each RGB-D video depicts a single gesture, and there are 249 gesture categories performed by 21 different individuals. Methods will be judged by their recognition performance.
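The track description does not spell out the metric beyond "recognition performance"; a natural assumption for isolated (single-gesture) videos is plain classification accuracy over the test set. A minimal sketch under that assumption (the function name is hypothetical):

```python
def recognition_rate(predicted_labels, true_labels):
    # Fraction of test videos whose predicted gesture category
    # matches the ground-truth category. Assumes one label per
    # video, which fits the isolated-gesture setting of this track.
    correct = sum(p == t for p, t in zip(predicted_labels, true_labels))
    return correct / len(true_labels)
```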


As with previous challenges organized by ChaLearn, all four tracks of the contest will be run on the CodaLab platform. Information about registration and participation can be found on each track's CodaLab site.

In broad terms, the challenge will proceed as follows (track 1 will adhere to a slightly different procedure):

  1. Participants register for the challenge. 
  2. Development (labeled) and validation (unlabeled) data sets are made available to registered participants. 
  3. Participants develop their methods using the development and validation data. 
  4. Validation labels are released; participants can tune their methods and use development+validation data to train their final models. ** Please note that for track 1, validation labels will not be released. **
  5. Final evaluation (test) data are released; participants make predictions for the test samples and submit them via CodaLab.
  6. Participants submit fact sheets describing their methods.
  7. Organizers run the verification process and announce the final results. 
  8. Top-ranked participants with verified code are eligible for awards. 
  9. Top-ranked participants are encouraged to submit a paper to the associated ICPR workshop. 
  10. Winning certificates are awarded during the ICPR workshop in Cancún, Mexico. 

Important dates

Important dates (quantitative challenge):

  • 30th June, 2016: Beginning of the quantitative competition, release of development (with labels) and validation data (without labels).
  • 8th August, 2016: Release of encrypted final evaluation data (without labels) and validation labels. Participants can start training their methods with the whole data set.
  • 10th August, 2016: Paper submission deadline for non-participants.
  • 12th August, 2016: Release of final evaluation data decryption key. Participants start predicting the results on the final evaluation data.
  • 16th August, 2016: End of the quantitative competition. Deadline for submission of predictions on the final evaluation data. Deadline for code submission. The organizers start the code verification by running it on the final evaluation data.
  • 17th August, 2016: Deadline for submitting the fact sheets.
  • 20th August, 2016: Release of the verification results to the participants for review. Participants are invited to follow the paper submission guide for submitting contest papers.
  • 25th August, 2016: Paper submission deadline for participants.
  • 2nd September, 2016: Notification of paper acceptance.
  • 5th September, 2016: Camera-ready deadline for contest papers.
  • December 2016: ICPR 2016 Joint Contest on Multimedia Challenges Beyond Visual Analysis, challenge results, award ceremony.