ARI is a high-performance robotic platform designed for a wide range of multimodal, expressive gestures and behaviours. Its touchscreen, gaze control and versatile gestures make it an ideal social robot, well suited to research in human-robot interaction, perception, cognition and navigation. With a combination of an Intel i7 CPU and an NVIDIA Jetson TX2 GPU, it provides ample processing power for AI development.

For technical questions regarding the public simulation of the robot please write to

If you wish to know more or request a quote, please send us a message or refer to the product microsite


A comprehensive set of tutorials is now available for the ARI robot in the Gazebo simulator, currently covering OpenCV, autonomous navigation, MoveIt! and human-robot interaction.

Public simulation packages overview

This section presents an overview of the packages used in the public simulation of ARI, with links to the corresponding wiki pages describing each package. For installation instructions, please refer to the ARI simulation installation tutorial.
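As a rough sketch of how the public simulation is typically started once the packages from the installation tutorial have been built in a catkin workspace: the workspace is sourced and the simulation is brought up with `roslaunch`. Note that the workspace path, the package name `ari_gazebo`, the launch file name and the `public_sim` argument shown here are assumptions based on PAL Robotics' conventions for similar robots, not confirmed by this page; check the installation tutorial for the exact commands for your setup.

```shell
# Hypothetical launch sequence for the ARI public simulation (names are
# assumptions following PAL Robotics' usual package conventions).
source /opt/ros/melodic/setup.bash        # or whichever ROS distro you installed
source ~/ari_public_ws/devel/setup.bash   # workspace built per the installation tutorial

# Start Gazebo with the ARI model; public_sim selects the open-source stack.
roslaunch ari_gazebo ari_gazebo.launch public_sim:=true
```

Once Gazebo is running, the tutorials above (OpenCV, navigation, MoveIt!, HRI) are exercised against the topics and actions exposed by this simulated robot.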

A picture is worth a thousand words: ARI Gallery