Our Robots

Meka

Meka (made by Meka Robotics, now part of Google) joined GMU in 2014 and is the oldest robot in the lab. She is a humanoid robot head with seven degrees of freedom, high-resolution FireWire cameras in each eye, zero-backlash Harmonic Drive gearing in the neck, and a rich repertoire of human-like movements and postures. She weighs 7.6 kg (16.7 lbs), has a pair of luminous, waggling dog-like ears, and can be attached to a humanoid torso. Her appearance is customized to match a Japanese anime character. Meka can process and react to the eye and body movements of human interaction partners. In the lab, Meka is used to examine mechanisms of joint attention in realistic human-robot interactions.



Cozmo

12 Cozmos (made by ANKI) joined the lab in 2016. Cozmo is a toy robot with a huge personality that evolves over time. He explores the environment on his own or guided by human interaction partners. Cozmo can express a wealth of human-like emotions in response to his own actions or to his interaction partner's input. Cozmo comes with three cubes that he is very keen on and that he can use to play games with his fellow humans. All of Cozmo's functionalities can be accessed and modified via a mobile app. The SDK is freely available, which makes it easy to program new behaviors in Python. Cozmo is used for studies on reward processing, reinforcement learning, and social bonding.



Vector

Vector (made by ANKI) joined the lab in 2018; he is Cozmo's big brother (literally). Vector is designed to be a robot companion that lifts your mood and can do simple daily chores for you. He runs a sophisticated AI model and explores the environment completely autonomously. Vector can read the room, predict the weather, set a timer, take snapshots, and make phone calls for you. Like Cozmo, Vector can easily be controlled via a mobile app and develops more advanced skills over time by interacting with his human friends. His SDK is also freely available, and he can be programmed in Python. Vector is used for studies on trust in automation and trust calibration in human-robot teams.



Furhat

Furhat (made by Furhat Robotics) joined the lab in Fall 2017. It is a customizable back-projection robot head. Furhat is a social robot that communicates with us humans as we do with each other: by speaking, listening, showing emotions, and maintaining eye contact. It comes with a variety of pre-built voices, faces, and expressions that can be modified, adjusted, and added to. Furhat can engage in basic conversations using Google's natural language processing. Furhat is used for studies on mind perception and the uncanny valley.



Nao

2 NAOs (made by SoftBank Robotics) joined the lab in Fall 2017. NAO is an infant-sized humanoid robot used in research and education worldwide. He is equipped with tactile sensors, ultrasonic sensors, a gyroscope, an accelerometer, force sensors, infrared sensors, 2 HD cameras, 4 microphones, and high-accuracy digital encoders on each joint. Interactions with NAO can be simulated using the dedicated software Choregraphe, the simulation software Webots, and the SDK simulator. NAO's strengths are his sophisticated input and output devices, which allow him to move in a very human-like fashion, perceive the environment at a high level of detail, and interact intuitively with human interaction partners. The NAOs are used for studies on mind perception and entrainment in human-robot interaction.



Maki

8 Maki robots (made by Hellorobo) joined the lab in Spring 2019. Maki is an open-source, 3D-printable, low-cost, customizable robot platform that is nonetheless socially expressive and adaptive. Using cameras and microphones, Maki can attune in real time to social cues sent by human interaction partners, and he can engage in basic verbal conversations using an Amazon Echo Dot. The Makis will be used to examine the effect of customization on motivation as well as the effect of mind perception on social cognitive processes in long-term human-robot interactions.