Emotion Recognition
Emotion Recognition tests whether participants can identify the emotion expressed in facial images. Researchers upload up to 50 images and label each with one of five emotions. Participants are shown a random subset of up to 10 images and must select the matching emotion for each.
ActivitySpec: lamp.emotion_recognition
Cognitive domain: Social cognition
Configuration
Researchers must provide a dataset of up to 50 facial images and label each with the corresponding emotion. Available emotions: happiness, sadness, fear, anger, and neutral.
If fewer than 10 images are uploaded, all images will be shown in random order.
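A settings payload for this activity might be assembled as below. This is a minimal sketch: the field names (`images`, `url`, `emotion`) and the `make_settings` helper are illustrative assumptions, not the official LAMP schema — consult the API Reference for the exact shape.

```python
# Hypothetical builder for an Emotion Recognition settings payload.
# Field names are assumptions; only the limits (50 images, 5 emotions)
# come from the documentation above.
EMOTIONS = ["Happiness", "Sadness", "Fear", "Anger", "Neutral"]

def make_settings(labeled_images):
    """labeled_images: list of (image_url, emotion) pairs, at most 50."""
    if len(labeled_images) > 50:
        raise ValueError("Emotion Recognition accepts at most 50 images")
    for _, emotion in labeled_images:
        if emotion not in EMOTIONS:
            raise ValueError(f"Unknown emotion label: {emotion}")
    return {
        "images": [
            {"url": url, "emotion": emotion}
            for url, emotion in labeled_images
        ]
    }

settings = make_settings([
    ("https://example.com/face1.png", "Happiness"),
    ("https://example.com/face2.png", "Neutral"),
])
```

Validating labels client-side like this keeps a mistyped emotion from silently producing unscorable items.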
Sample Instructions
"In this task you will be presented with up to 10 facial images and asked to identify the emotion expressed by each one from a list of 5 emotions."
Usage
The participant sees one image at a time, selects the emotion they believe matches, and taps "Save" to proceed. The assessment cycles through up to 10 randomly selected images.
Scoring
Scoring is based on whether the participant correctly identifies the emotion expressed in each image; each response is recorded as correct or incorrect.
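Given the correct/incorrect flag stored per response (the `type` field in the temporal_slices data described under Data), an overall percent-correct score can be computed with a few lines of Python. This is a sketch of the scoring rule as stated, not an official SDK function:

```python
def score(slices):
    """Percentage of images whose emotion was correctly identified.

    `slices` is a list of temporal-slice dicts where `type` is True for
    a correct response, per the temporal_slices field descriptions.
    """
    if not slices:
        return 0.0
    correct = sum(1 for s in slices if s["type"])
    return 100.0 * correct / len(slices)

example = [
    {"item": 1, "value": "Happiness", "type": True,  "duration": 1.2},
    {"item": 2, "value": "Fear",      "type": False, "duration": 2.5},
]
score(example)  # -> 50.0
```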
Screenshots


Data
temporal_slices
| Field | Description |
|---|---|
| item | Question number |
| value | Selected emotion: `"Happiness"`, `"Sadness"`, `"Fear"`, `"Anger"`, `"Neutral"` |
| type | `true` = correct, `false` = incorrect |
| duration | Response time (seconds) |
| level | Prompt text |
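One temporal slice is recorded per presented image. The sketch below shows the shape such a record would take given the field descriptions above, plus a small helper for summarizing response times; the prompt string in `level` is a made-up example, not actual activity text.

```python
# Example temporal slice; field names follow the table above.
sample_slice = {
    "item": 3,            # question number
    "value": "Anger",     # emotion the participant selected
    "type": True,         # True = correct, False = incorrect
    "duration": 1.8,      # response time in seconds
    "level": "What emotion is this person expressing?",  # hypothetical prompt
}

def mean_response_time(slices):
    """Average response time (seconds) across all answered items."""
    if not slices:
        return 0.0
    return sum(s["duration"] for s in slices) / len(slices)
```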
Cortex Features
No Cortex features currently process Emotion Recognition data.
View in Portal | Python SDK | API Reference