Links to useful tools for CDS members. Click on each to learn more.
Children's Social Understanding Scale
The Children’s Social Understanding Scale (CSUS) is a parent-report measure designed to assess individual differences in children’s theories of mind. The scale includes assessments of belief, knowledge, perception, desire, intention, and emotion understanding, and has been well validated for children in the 3 to 6 age range (Tahiroglu, Moses, et al., Developmental Psychology, 2014). Both long-form (42-item) and short-form (18-item) versions are available for download here. For further information, please contact Lou Moses (email@example.com).
Databrary
Databrary (http://databrary.org) is a digital data library for sharing video and related research data and metadata, designed especially for developmental scientists. Databrary is funded by awards from the U.S. National Science Foundation (NSF), BCS-1238599, and the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), U01-HD-076595. Karen Adolph (NYU) and Rick Gilmore (Penn State) are Co-PIs. For more information about sharing your research data with Databrary, send an email to firstname.lastname@example.org.
Datavyu
Datavyu (http://datavyu.org) is a complete software package for visualizing and coding behavioral observations from video data sources, developed by and for developmental scientists. Datavyu supports multiple data streams—video, audio, physiology, motion tracking, eye tracking—and links them together with a flexible, extensible coding spreadsheet that enables time-locked coding and visualization. Keyboard shortcuts and user-defined scripts let users navigate quickly and iteratively through data streams, adding comments, codes, and interpretations.
Datavyu is part of the Databrary (http://databrary.org) project, led by Karen Adolph (NYU) and Rick Gilmore (Penn State) and supported by the U.S. National Science Foundation (NSF) under BCS-1238599 and the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) under U01-HD-076595.
Early Motor Questionnaire (EMQ)
EyetrackingR
EyetrackingR is an R package for eye-tracking data that requires minimal R programming experience. It is designed to make working with eye-tracking data easier, addressing tasks along the pipeline from raw data to analysis and visualization. It offers several popular types of analyses, including growth-curve analysis, onset-contingent reaction-time analysis, and several non-parametric bootstrapping approaches. For installation instructions and tutorials, visit http://www.eyetracking-r.com
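To give a flavor of the non-parametric bootstrapping idea mentioned above, here is a minimal sketch in Python rather than R (this is not eyetrackingR's API, and the per-trial looking proportions are made-up numbers for illustration only):

```python
import random

# Hypothetical per-trial proportions of looking to target in two conditions
# (illustrative numbers only, not real data).
cond_a = [0.62, 0.55, 0.71, 0.48, 0.66, 0.59, 0.73, 0.51]
cond_b = [0.45, 0.52, 0.40, 0.58, 0.43, 0.49, 0.47, 0.55]

def bootstrap_ci(a, b, n_boot=5000, seed=1):
    """Percentile bootstrap CI for the difference in mean looking proportion."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        ra = [rng.choice(a) for _ in a]  # resample trials with replacement
        rb = [rng.choice(b) for _ in b]
        diffs.append(sum(ra) / len(ra) - sum(rb) / len(rb))
    diffs.sort()
    return diffs[int(0.025 * n_boot)], diffs[int(0.975 * n_boot)]

lo, hi = bootstrap_ci(cond_a, cond_b)
print(f"95% CI for condition difference: [{lo:.3f}, {hi:.3f}]")
```

eyetrackingR wraps this kind of resampling logic (along with time-binning and cluster-based variants) so that users do not have to implement it by hand.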
Visit the Habit2 Website
Novel Object and Unusual Name (NOUN) Database
The NOUN Database comprises images of novel, unusual objects for experimental research. The database includes 64 primary stimuli and a collection of 10 novel categories, each including three exemplars. The catalog and images can be downloaded here: http://michaelhout.com/?page_id=759. The current edition and its validation were funded by the British Academy and the Leverhulme Trust (SF120032). For more information please contact Jessica Horst (email@example.com) or Michael Hout (firstname.lastname@example.org).
Reference: Horst, J. S., & Hout, M. C. (in press). The Novel Object and Unusual Name (NOUN) Database: A collection of novel images for use in experimental research. *Behavior Research Methods*.
Numerus
Numerus is a simple web-based application designed to create stimuli for numerical cognition experiments. It was developed by Koleen McCrink and Francis Kelly, using funds from NIH R15HD065629-01.
Using Numerus, you can create and export arrays of objects (rectangles, squares, triangles, and circles) in which variables such as area, perimeter, density, and item size are systematically manipulated. You can also export each array’s specific stimulus dimensions in a .csv file.
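The kind of manipulation Numerus automates can be sketched in a few lines of Python (a hypothetical illustration, not Numerus’s own code): generate two arrays of circles that differ in number but are matched on total area, and export each array’s item dimensions to a .csv file.

```python
import csv
import math
import random

def make_circle_array(n_items, total_area, canvas=400, seed=0):
    """Place n_items equal-area circles whose areas sum to total_area
    (in pixels^2) at random positions on a square canvas."""
    rng = random.Random(seed)
    area_each = total_area / n_items
    radius = math.sqrt(area_each / math.pi)
    items = []
    for i in range(n_items):
        # Keep each circle fully inside the canvas.
        x = rng.uniform(radius, canvas - radius)
        y = rng.uniform(radius, canvas - radius)
        items.append({"item": i + 1, "x": round(x, 1), "y": round(y, 1),
                      "radius": round(radius, 2), "area": round(area_each, 1)})
    return items

def export_csv(items, path):
    """Write the array's stimulus dimensions to a .csv file."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=items[0].keys())
        writer.writeheader()
        writer.writerows(items)

# Two arrays differing in number (8 vs. 16) but matched on total area:
export_csv(make_circle_array(8, 20000), "array_8.csv")
export_csv(make_circle_array(16, 20000), "array_16.csv")
```

Controlling total area while varying number, as here, is one of the standard ways to ensure that infants or children cannot solve a numerical task using continuous extent alone.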
To use Numerus, you must be running Mac OS, have access to a web browser, and have the free Python programming language installed on your computer (available at python.org). No programming experience is needed to run Numerus. Instructions for installing and running Numerus are inside the Numerusapp folder (Numerusapp → Installation.pdf; SEE NOTE BELOW FOR UPDATES TO THESE INSTRUCTIONS), and instructions for creating stimuli are in the Numerus Introduction video, also in the Numerusapp folder (Numerusapp → bcdc → numerus → static → video → Numerus.Introduction.mv4).
**Important Note:** Recent updates to Mac OS security require launching the program after download in a slightly different way than the instructions indicate. To launch Numerus, instead of clicking Startnumerus, right-click launch.py and open it with the Python Launcher manually. You can then select numerus.webloc to start the program.
Thank you for your interest, and we hope you find this program useful.
To cite Numerus in text, use: McCrink (2014). The accompanying reference is: McCrink, K. (2014). Numerus (Version 1.0) [Software].
Preferential Looking Coder (Pref Coder)
Pref Coder is software designed for quickly coding the gaze direction of participants in looking paradigms. The software includes five pre-defined gaze locations (left, center, right, distracted, look both) that fit most preferential looking, anticipation, or forced-choice experimental designs. In addition, four touch locations can be specified (touch left, touch center, touch right, no touch). The sampling rate of Pref Coder is set to 10 frames per second (one frame every 100 ms). Multiple trials can be coded for one video, and both summary and raw outputs are provided. Raw output files provide a frame-by-frame (every 100 ms) annotation of the video using a binary (checked vs. unchecked) code. Pref Coder has been specifically designed to enable fast coding of infant behaviors. The fastest coding can be achieved using keyboard shortcuts: step forward or backward with the right and left arrow keys, code looking behaviors with the w, e, a, s, and d keys, and code touching behaviors with the z, x, c, and v keys. For questions or comments, please email Klaus Libertus.
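As a hypothetical illustration of how a frame-by-frame binary raw output like Pref Coder’s could be reduced to summary looking times (the column layout below is assumed for illustration, not Pref Coder’s actual file format):

```python
# Each row is one 100-ms frame; 1 = checked, 0 = unchecked, one column per
# gaze location (hypothetical layout, not Pref Coder's actual raw file).
FRAME_MS = 100

raw_frames = [
    {"left": 1, "center": 0, "right": 0},
    {"left": 1, "center": 0, "right": 0},
    {"left": 0, "center": 1, "right": 0},
    {"left": 0, "center": 0, "right": 1},
    {"left": 0, "center": 0, "right": 1},
    {"left": 0, "center": 0, "right": 1},
]

def summarize_looking(frames, frame_ms=FRAME_MS):
    """Total looking time (ms) per gaze location from frame-by-frame codes."""
    totals = {}
    for frame in frames:
        for location, checked in frame.items():
            totals[location] = totals.get(location, 0) + checked * frame_ms
    return totals

print(summarize_looking(raw_frames))
# → {'left': 200, 'center': 100, 'right': 300}
```

Because each frame spans a fixed 100 ms, summing checked frames per location recovers total looking durations, which is the core quantity in most preferential-looking analyses.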
The Child Affective Facial Expression (CAFE) Set
The Child Affective Facial Expression (CAFE) set is a validated stimulus set of ~1200 photographs of children (ages 2-8) posing 7 different emotional facial expressions: happy, angry, sad, fearful, surprised, neutral, and disgusted. The set was created by Vanessa LoBue (Rutgers University) and Cat Thrasher (University of Virginia), and its validation was funded by the National Science Foundation’s Division of Behavioral and Cognitive Sciences (BCS-1247590). It is available for download from http://databrary.org/. You will need to register at http://databrary.org/register to become an authorized researcher.
For more information about the CAFE set, please visit: http://childstudycenter.rutgers.edu/Child_Affective_Facial_Expression_Set.html
Wordbank
Wordbank (wordbank.stanford.edu) is an open, browsable database of children’s vocabulary development. It contains data for thousands of children from MacArthur-Bates Communicative Development Inventory forms (parent-report questionnaires about early language), across a number of languages. For further information, please contact Michael C. Frank (email@example.com).