This project explores how gaze, speech and sound, small movements, gesture, and proximity to the user affect the way we perceive and work with social robots. We are testing these design primitives on platforms ranging from simple wheeled robots to humanoid systems.