FAQ

1) The environment is mapped.
2) Inventory areas are defined through the web GUI.
3) The desired number of inventories is scheduled in the web GUI.
4) The retail robot autonomously performs the inventories according to the configuration set.

Knowing your products’ location gives you exhaustive knowledge of your store’s performance and the information you need to improve it. With the retail robot’s data, you can, on a daily basis:
- Detect misplaced products
- Detect out-of-stock situations
- Analyse your products’ distribution effectiveness
- Make data-driven decisions
- Check the end-cap distribution (with the autonomous mobile robot that uses vision cameras)
- Enable planogram checking (with the retail robot that uses vision cameras)
…And more! Contact us and we will assess how to make the most of your store!

Sure! StockBot was born to adapt to everything that happens inside a store. Taking inventory while the store is open, with people around, is how our retail robot works most of the time. Its advanced navigation system enables the robot to detect any person around it, and to update its map if the furniture layout changes. Get in touch with us to learn more.

StockBot is set up in your facilities in a single day, during which our team maps the store area and integrates the robot into your store’s inventory system. After that, you can easily define the areas of interest and forbidden areas in the retail robot’s intuitive web GUI. As simple as that!

StockBot goes to recharge in its dock station autonomously after completing an inventory. The retail robot can run up to 12 hours continuously before going to the dock station, and completely recharges in 4 hours.

It locates products with a precision of 0.5 m.

We integrate the data generated by the robot into the ERP of the customer, adapting to any API provided.

The supported RFID standard is EPCglobal UHF Class 1 Gen 2 / ISO 18000-6C.

ARan is an extremely versatile platform for research, as it features ROS, the well-established open-source robotics middleware, which provides access to all the hardware through its APIs and allows users to build on top of the existing repositories or to deploy their own software. In terms of hardware, ARan includes two motor-wheels with encoders, a LiDAR, two RGBD cameras, an IMU, and both a standard computer and an NVIDIA GPU to make the most out of the robot. In terms of software, ARan is suitable for non-expert users, as it offers an intuitive Visual Programming framework powered by Advanced Navigation and Logistics Tasks Planner building blocks, as well as for expert roboticists, thanks to ROS, its rich set of development tools and the widespread research community that supports it.

Yes. ARan is provided with mechanical mounting points and a user panel including power supply and communication ports (Ethernet, USB, CAN, GPIOs) to easily integrate your own devices or equipment on top of the robot. Furthermore, if a more comprehensive customisation is required, PAL Robotics’ engineering team is on hand to gather the requirements and work on this. Contact us to discover how to customise your ARan.

ARan is equipped with a wide field-of-view LiDAR (245º) and two RGBD cameras that provide rich data, plus both CPU and GPU computation power to handle this type of environment. In addition, ARan is equipped with a speaker and LED strips to synthesise speech or emit audible and luminous signals that help the robot find its way in environments with people.

The sky’s the limit. A robotic arm can be integrated using the mechanical mounting points exposed on ARan and the power supply and communication ports exposed on ARan’s user panel. You decide how both the AMR and the arm coordinate and make the most out of each other. Contact us to discover more about the possibilities of customisation for ARan.

Yes, the bottom part of the robot has been designed to isolate the onboard electronics from traces of water on the ground.

Of course. ARan can communicate via its off-the-shelf wireless connectivity options, i.e. WiFi 6 and Bluetooth 5.0, or via other wireless protocols by easily connecting additional emitters.

Yes, one of the RGBD cameras is pointing towards the floor to account for these specific safety goals.

If you are an expert user you will probably use C++ or Python, and will use the ROS API of the robot or link against its libraries. You may also want to use ARan’s RESTful API. If you prefer an easier entry point, you can use the web-based Visual Programming framework, which enables you to quickly define complex behaviours and tasks.
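As a sketch of what a simple script against a RESTful navigation endpoint could look like: the base URL, endpoint path and JSON fields below are illustrative assumptions for this example, not ARan’s documented API.

```python
import json
import urllib.request

# Hypothetical base URL and endpoint -- assumptions for illustration,
# not ARan's documented RESTful API.
ROBOT_URL = "http://aran.local:8080"

def build_goto_request(x, y, theta):
    """Build a POST request sending a navigation goal to a
    hypothetical /api/goto endpoint."""
    payload = json.dumps({"x": x, "y": y, "theta": theta}).encode("utf-8")
    return urllib.request.Request(
        f"{ROBOT_URL}/api/goto",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_goto_request(1.5, -0.7, 0.0)
# The request would be sent with urllib.request.urlopen(req).
print(req.full_url)
```

Treat this as a shape rather than a contract: the real endpoint names and schema come from the API documentation shipped with the robot.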

After calibrating the odometry of the wheels, the following residual errors are obtained:
- 1 m forward motion: 0.00513854 m absolute error (0.51%)
- 360° rotation: 0.548° absolute error (0.15%)
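The percentage figures are simply the absolute residuals divided by the commanded motion; a quick check:

```python
# Relative errors implied by the odometry calibration residuals above.
forward_err_m = 0.00513854    # absolute error over a 1 m forward motion
rotation_err_deg = 0.548      # absolute error over a 360 deg rotation

forward_pct = 100.0 * forward_err_m / 1.0
rotation_pct = 100.0 * rotation_err_deg / 360.0

print(f"{forward_pct:.2f}%")   # 0.51%
print(f"{rotation_pct:.2f}%")  # 0.15%
```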

The max traversable step is about 1 cm.

The maximum slope is about 6% and depends on the floor type and payload on the robot.

TIAGo Base has an industry-certified version complying with EU regulations, i.e. CE Marking, according to the Machinery Directive 2006/42/EC, the Low Voltage Directive 2014/35/EU and the EMC Directive 2014/30/EU. The robot complies with the following standards:
- EN ISO 12100:2010 Safety of machinery — General principles for design — Risk assessment and risk reduction
- EN ISO 13849-1:2015 Safety of machinery — Safety-related parts of control systems — Part 1: General principles for design
- EN 1037:1995+A1:2008 Safety of machinery — Prevention of unexpected start-up
- EN ISO 13482:2014 Robots and robotic devices — Safety requirements for personal care robots
- ISO/TS 15066:2016 Robots and robotic devices — Collaborative robots
- IEC 62133 Secondary cells and batteries containing alkaline or other non-acid electrolytes — Safety requirements for portable sealed secondary lithium cells, and for batteries made from them, for use in portable applications — Part 2: Lithium systems
And it is being tested by a certified laboratory according to the following norms:
- EN 61000-6-2:2017 Electromagnetic compatibility (EMC) — Part 6-2: Generic standards — Immunity for industrial environments
- EN 61000-6-4:2017 Electromagnetic compatibility (EMC) — Part 6-4: Generic standards — Emission standard for industrial environments
- EN 60204-1:2006/A1:2009 Safety of machinery — Electrical equipment of machines — Part 1: General requirements

The maximum speed is 1 m/s.

The payload of TIAGo Base is 70 kg.

You can use Wi-Fi or any other wireless technology that you need. Contact us for further details!

You can leave your space as it is: TIAGo Base doesn’t need any guides, tracks, or other infrastructure to autonomously navigate and perform deliveries. TIAGo Base is an Autonomous Mobile Robot (AMR) and uses an advanced set of sensors and software to navigate safely without following any guide. This is a huge step ahead of traditional AGVs. TIAGo Base’s artificial intelligence generates the most efficient route to reach its delivery point, which is flexible and can change according to production/business needs. People and other mobile elements can safely move around it and even cross its path.

Yes, you can program TIAGo Base to work the hours your business needs. Once it fulfills all the tasks assigned, TIAGo Base will go back to its dock station and recharge while waiting for a new command. If the mobile robot runs out of battery in the middle of a task, our Fleet Control Software can autonomously send another TIAGo Base to take it over, ensuring a constant production flow. Tell us how many deliveries you need in your production line, your warehouse, or your company, and we can give you accurate details on that!

Sure! You can even combine diverse configurations of TIAGo Bases within your fleet according to your needs. We provide a Fleet Control Software to assign tasks, optimize routes, perform traffic control and enable analytics. Ask us for more details or if you have any special request!

You can choose the option that suits your needs best:
- Use our intuitive Web GUI, compatible with multiple devices (tablet, computer, mobile, etc.).
- Integrate our AMRs with your ERP or internal management system. We can provide you or your integrator with a RESTful API for a straightforward integration.
And you’re ready to go!

Sure! Contact us and we will design a proof of concept so you can first test the mobile robots in your own facilities. See for yourself the advantages of having TIAGo Bases that boost your productivity!

A RESTful API is provided to integrators and end-users to handily integrate our AMRs with your ERP or business’ internal system. This way, you can command your fleet of TIAGo Bases from the same software you use to have everything under control. If you have any further questions about logistics automation and software integration, contact us!
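To illustrate what an ERP-side integration might send to such an API, here is a minimal task payload builder. The field names and station identifiers are assumptions for this sketch, not PAL Robotics’ actual RESTful schema:

```python
import json

def make_delivery_task(robot_id, pickup, dropoff, priority="normal"):
    """Serialise a delivery task that an ERP could POST to a fleet API.
    Field names are hypothetical, for illustration only."""
    return json.dumps({
        "robot": robot_id,
        "pickup_station": pickup,
        "dropoff_station": dropoff,
        "priority": priority,
    })

task = make_delivery_task("tiago-base-01", "warehouse-A3", "line-2")
print(task)
```

The real schema, authentication and endpoints are defined by the RESTful API documentation provided with the fleet software.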

At PAL Robotics we mostly use C++ and Python. As the REEM-C OS is a standard Ubuntu LTS, our customers can use other programming languages for their bipedal robot.

The REEM-C research robot can speak over 30 languages.

Yes, Kangaroo incorporates custom linear actuators with integrated force sensors, providing a wide range of motion as well as high speeds and torques at the joints.

TALOS’ torque control loop runs at 2 kHz, providing very low latency.

We provide state estimation from IMU + kinematics out of the box. Fusing this odometry with additional external measurements should be tackled with an additional estimator outside the real-time ros_control loop; this is safer for the whole-body torque control, which relies on the CoM estimation.
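To illustrate what such a non-real-time estimator could look like in its simplest form, here is a 1-D complementary filter that integrates odometry increments and pulls the estimate toward an absolute external measurement. The gain and the numbers are made up for the sketch; this is not the estimator shipped with TALOS:

```python
def fuse_step(prev_est, odom_delta, external_pos, alpha=0.98):
    """One filter step: integrate the odometry increment, then blend the
    prediction with the absolute external measurement (e.g. motion capture).
    alpha close to 1 trusts odometry short-term, the external fix long-term."""
    predicted = prev_est + odom_delta
    return alpha * predicted + (1.0 - alpha) * external_pos

estimate = 0.0
# (odometry increment, absolute external fix) per step -- made-up values.
for delta, fix in [(0.10, 0.11), (0.10, 0.22), (0.10, 0.33)]:
    estimate = fuse_step(estimate, delta, fix)
print(round(estimate, 4))
```

A production estimator would be stateful over full 6-DoF poses and handle measurement delays, but the blending structure is the same idea.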

Via ros_control, you have access to the encoders, the IMU, torque sensors at the joint level, force/torque sensors on the wrists and ankles, and the robot’s temperature sensors.

Yes, the URDF includes CoM position, mass and inertia tensors. This model has been extracted from the CAD and it was validated by the full body inverse dynamics and torque control on the walking biped robot.

Expressive, helpful and entertaining, ARI helps you design the social interactions you are aiming for. Our collaborative robot has been created as a safe, interactive and autonomous social robot that can help you with customer-facing activities. As a receptionist, guide, entertainer or presenter, it connects with your customers.

Our social robot is very easy to deploy. It has an intuitive user interface that helps you easily adapt the collaborative robot to your needs. If you want to know more about how to use the robot in your environment, we are happy to hear more about your needs and help you along the way. You can reach out to our team via the contact form and we will get back to you within at most one business day.

ARI is easily integrated with external systems. The social robot has WiFi capabilities and can connect to your facility’s network infrastructure and internet-based cloud services. It has a powerful and extensive API that integrators and end-users can take advantage of to connect the collaborative robot with thousands of different services.

ARI adapts to most environments, so you won’t need to make any modifications in the majority of the cases. The social humanoid robot uses advanced technology based on multiple sensors that allows the robot to safely interact and navigate without requiring any modification such as guiding tracks or other complex installation that other platforms might need.

ARI is able to speak in more than 30 languages and this is just the start for our social robotics. If you have a need for a specific language, please let us know and we can confirm if our social robot is able to speak that language.

An intuitive and user-friendly web-based interface allows you to easily alter ARI’s touchscreen content, speech, and even create your own gestures. This way, we ensure that the applications and behaviour of our social robot can be easily controlled from any device such as desktop, smartphone or tablet.

ARI’s motions, speeches and touchscreen contents are easy to adapt by using the provided SDK to match the needs of your application. ARI’s eyes are also customizable for enabling better interactions. Multiple themes are available for the touchscreen and the brightness is adaptable to different light conditions. In case you need a customization for a larger volume of units such as a special colour, please let us know and we will be happy to find a way to help you with the desired customization for your collaborative robot.

The easiest way is to get in touch with our team through our contact form and we will get back to you shortly.

The “Web-based Visual Programming” Premium Package is an advanced feature that allows users to program and customise the TIAGo OMNI Base’s tasks and behaviours without the need for traditional coding. Through an intuitive web interface, users can drag and drop predefined blocks to create complex workflows, making it easier to tailor the robot’s functions to specific needs and enhancing efficiency and user experience. Contact us to discover more about our premium software packages.

High traction ensures reliable operation across different surfaces, maintaining performance in various environmental conditions.

The robot is designed to be quickly set up and managed by non-expert users through the user-friendly web GUI and the Visual Programming package.

TIAGo OMNI Base improves efficiency, reduces operational costs, and enhances manoeuvrability in intralogistics. Get in touch with us to see how we can streamline and automate your logistics and warehouse.

Comprehensive training and online technical support are provided to ensure smooth operation and integration. Contact our team of experts to discover more on our assistance package.

TIAGo OMNI Base uses LiDARs and cameras for real-time obstacle detection and navigation.

Yes, its precise navigation and hygiene-friendly design make it suitable for sensitive environments like healthcare and labs.

The software provides enhanced mapping, localisation, and navigation capabilities (Virtual Obstacles, Points and Zones of Interest, Highways and Ramps support), utilising the robot’s sensors to their fullest potential.

The basic package includes the robot with standard sensors and software for autonomous navigation, all 100% ROS-based. A simulation of the robot in Gazebo is also provided, along with its digital twin in RViz. Contact us to discover more!

TIAGo OMNI Base excels in industrial, healthcare, hospitality, and office environments due to its versatility and adaptability. Get in touch with us to discover how our AMR adapts to your business and operational needs.

With high traction and omnidirectional wheels, TIAGo OMNI Base can traverse uneven surfaces without compromising stability.

TIAGo OMNI Base is equipped to navigate through changing environments and adapt to obstacles in real time, thanks to the 360º field of view provided by its two lasers and an optional RGBD camera.

TIAGo OMNI Base is ideal for navigation research thanks to its omnidirectional movement, payload capacity, computational capabilities, 100% ROS integration and advanced sensors, allowing for the development and testing of navigation algorithms.

Yes, TIAGo OMNI Base can be easily integrated with existing ERP systems for streamlined operations. Reach out to us to learn more!

The camera add-on enhances spatial awareness, improving object recognition and navigation.

TIAGo OMNI Base offers various accessories and the robot can be customised to suit your operational and business requirements and needs. Contact us to get in touch with our team of experts that will help you discover all the integration opportunities of TIAGo OMNI Base.

The omnidirectional movement allows the robot to navigate tight spaces by minimising the number of manoeuvres, optimising logistics workflows and minimising incidents.
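The kinematics behind that manoeuvrability can be sketched with the standard inverse-kinematics mapping for a four-mecanum-wheel base; the geometry values below are placeholders for illustration, not TIAGo OMNI Base’s real dimensions:

```python
def mecanum_wheel_speeds(vx, vy, wz, lx=0.2, ly=0.2, r=0.05):
    """Inverse kinematics for a standard 4-mecanum-wheel layout.
    vx, vy in m/s, wz in rad/s; returns wheel angular speeds (rad/s)
    in the order front-left, front-right, rear-left, rear-right.
    lx, ly (half wheelbase/track) and r (wheel radius) are placeholders."""
    k = lx + ly
    return (
        (vx - vy - k * wz) / r,  # front-left
        (vx + vy + k * wz) / r,  # front-right
        (vx + vy - k * wz) / r,  # rear-left
        (vx - vy + k * wz) / r,  # rear-right
    )

# Pure sideways motion: the left and right wheel pairs counter-rotate,
# so the base translates laterally without turning.
print(mecanum_wheel_speeds(0.0, 0.3, 0.0))
```

This is why an omnidirectional base can slide sideways out of a tight aisle in one motion where a differential-drive base would need a multi-point turn.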

You can watch videos of TIAGo in action through the following YouTube TIAGo Playlist. In there you will find our mobile manipulator robot used for many research projects, robotics competitions or showing specific features. Enjoy!

The different motors of the mobile manipulator robot can be controlled using ROS interfaces. All the controllers provided are implemented as plugins of ros_control and are contained in ros_controllers. All the controllers run in the real-time control loop and have access to the full robot hardware interfaces exposed by ros_control. The motors of TIAGo can be controlled in the following modes:
- The wheels of the mobile base can be controlled in velocity mode.
- The motor of the lifting torso and the two motors of the head can be controlled in position mode.
- The motors of the arm can be controlled in position and in effort mode.
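As a toy illustration of what velocity mode means for the base (not the actual ros_control plugin, whose wheel commands are computed internally), a differential-drive base command maps to per-wheel speeds like this; the geometry values are placeholders, not TIAGo’s real dimensions:

```python
def diff_drive_wheel_speeds(v, w, wheel_sep=0.4, wheel_radius=0.25):
    """Map a base velocity command (linear v in m/s, angular w in rad/s)
    to (left, right) wheel angular speeds in rad/s.
    wheel_sep and wheel_radius are placeholder values."""
    v_left = v - w * wheel_sep / 2.0
    v_right = v + w * wheel_sep / 2.0
    return v_left / wheel_radius, v_right / wheel_radius

# Driving straight at 0.5 m/s: both wheels spin equally.
print(diff_drive_wheel_speeds(0.5, 0.0))  # (2.0, 2.0)
```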

Yes. The user only needs to create ros_control plugins to implement new controllers and add them to the ros_controllers or replace the desired ones. The new controllers will run in the real-time control loop and will have access to the full mobile manipulator robot hardware interfaces exposed by ros_control.

Yes. Compliance in TIAGo’s arm is achieved using two mechanisms:
- Sensorless torque control: open-loop torque references are commanded. This is based on feedforward current control, which is achieved thanks to the accurate model of the arm dynamics (including mechanics, electronics, cabling and covers) and the low friction of the joints. This results in acceptable force control in the operational space.
- Admittance control: using the data from the force/torque sensor of the wrist, positions can be commanded taking external forces into account, obtaining even better results than with sensorless torque control.
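At its core, the admittance idea reduces to displacing the position setpoint in proportion to the measured external force. A toy 1-D version (the stiffness value is a made-up placeholder, and TIAGo’s real controller is of course richer than this):

```python
def admittance_position(x_ref, f_ext, stiffness=200.0):
    """1-D admittance law: x_cmd = x_ref + f_ext / stiffness.
    A measured external force displaces the commanded position
    through a virtual stiffness K (placeholder value, N/m)."""
    return x_ref + f_ext / stiffness

# A 50 N push along the axis shifts the 0.5 m setpoint by 0.25 m.
print(admittance_position(0.5, 50.0))  # 0.75
```

A full implementation would also include virtual damping and inertia terms and act on all six Cartesian axes using the wrist force/torque readings.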

Yes. They use the same mounting connector and the same interface on the mobile manipulator robot. The process for interchanging the end effectors in TIAGo is very easy and fast.

The mobile manipulator robot comes with a 1-year hardware warranty. The warranty for your TIAGo can be extended according to your needs.