Spoken labels facilitate visual categorization in 3-month-old infants, whereas sine-wave tones do not. The early presence of this facilitatory effect of labels on categorization has been taken to reflect an evolutionarily predetermined link between speech sounds and concepts. Yet when labeling objects for 3-month-old infants, parents provide coordinated, intersensory structure that may itself play a role in the development of this early speech-concept link. In particular, parents present labels in synchrony with object motion, which has been suggested to direct infants' attention to relevant information during these multimodal labeling events. We previously demonstrated that experience with audio-visual synchrony can lead a non-speech signal (sine-wave tones) to facilitate categorization, suggesting that parents' use of multimodal synchrony contributes to the link between speech and concepts. Alternatively, multimodal synchrony may underlie the development of links between concepts and signals other than speech (e.g., sign language), but not speech-concept links per se. To test whether multimodal synchrony plays a role in the link between speech and concepts, we ran an experiment in which infants were presented with labels in synchrony with object motion prior to a categorization task. If synchronous pre-training boosts the effect of words on categorization, this would support the hypothesis that parents' use of synchrony plays a role in the development of speech-concept links.