Robots that can move into areas where people cannot, and that can collaboratively explore these environments by teaming with remote humans, have tremendous potential to make a difference in tasks such as search and rescue operations, reconnaissance, and mission planning. The goal of this project is to better understand the verbal and nonverbal components of multimodal dialogue interaction during collaborative exploration tasks.