ROBOTICS

Autonomous Mobile Robots: not a distant future but a current reality created by the sharpest minds

Humans, being intelligent, wanted to create a mechanical replica of themselves to handle monotonous and repetitive tasks, resulting in the genesis of robots. Robotics has since evolved by leaps and bounds to where we are today. In our journey at Ignitarium, we started with a simple Ackermann-steered rover and then evolved it into an intelligent, feature-rich platform. Today we have built expertise in ROS, sensor fusion, path planning, navigation, dynamic obstacle detection and avoidance, and perception engineering. Our expertise in vision, low-level software for edge devices, and AI/ML has helped us assemble the right team to build our software offerings in Robotics.

Ignitarium Robotics Highlights

Sensor Fusion

Sensors are key components of any autonomous machine, and each sensor has unique strengths or works best under certain conditions. Combining the inputs of multiple sensors into unified processed data yields a model that is more accurate and more reliable. The most common sensors in the industry are cameras, lidars and radars.

For State Estimation (Position & Velocity)

Odometry – Wheel Odometry, IMU & 2D Lidar

Visual Odometry – IMU & 3D Vision Camera

Velocity Estimation – Wheel Odometry, IMU & GPS

For Object Classification/Detection/Avoidance

2D Lidar with 3D Vision Camera

3D Lidar with camera

Radar with camera
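As a minimal sketch of the fusion idea behind the state-estimation combinations above (not Ignitarium's implementation), independent noisy estimates of the same quantity can be merged by inverse-variance weighting, the core principle underlying Kalman-style sensor fusion. The sensor values and variances below are illustrative assumptions:

```python
import numpy as np

def fuse_measurements(estimates, variances):
    """Fuse independent noisy estimates of the same quantity by
    inverse-variance weighting: more trusted sensors get more weight."""
    weights = 1.0 / np.asarray(variances, dtype=float)
    fused = np.dot(weights, estimates) / weights.sum()
    fused_var = 1.0 / weights.sum()  # fused estimate is always at least as certain
    return fused, fused_var

# Hypothetical example: velocity from wheel odometry (noisier)
# and from GPS (less noisy), fused into a single estimate.
v, var = fuse_measurements([1.2, 1.0], [0.04, 0.01])
```

The fused velocity lands closer to the GPS reading (1.0 m/s) because its variance is smaller, and the fused variance is lower than either input's, which is exactly why combining sensors improves reliability.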

Path Planning & Navigation

Autonomous Mobile Robots (AMRs) should be capable not only of detecting objects [static and dynamic] but also intelligent enough to avoid them by recalculating an optimized route between the current location and the destination. Path planning requires a map of the environment, and the robot must be aware of its location with respect to this map. Robots capable of Simultaneous Localization and Mapping (SLAM) can therefore use optimal coverage path-planning approaches to achieve systematic coverage of the entire free space.

Path Planning

Self-localization – Where am I?

Path planning – How do I get to my destination?

Map building and interpretation – Geometric representation of the robot's environment
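The path-planning step above can be illustrated with a toy example (a sketch, not the production planner): A* search on a 4-connected occupancy grid, where 1 marks an obstacle cell and the Manhattan distance serves as the heuristic. The grid layout is a made-up example:

```python
import heapq

def a_star(grid, start, goal):
    """A* shortest path on a 4-connected occupancy grid (1 = obstacle).
    Returns the path as a list of (row, col) cells, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, [start])]  # (f = g + h, g, node, path)
    seen = set()
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                if nxt not in seen:
                    heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None

# Hypothetical 3x3 map: a wall blocks the direct route, so the robot detours.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = a_star(grid, (0, 0), (2, 0))
```

In a real AMR the grid would come from the SLAM-built map, and replanning against dynamic obstacles would simply rerun the search on the updated grid.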

SLAM Algorithms

Hector SLAM

GMapping

Cartographer

RTAB-Map

ORB-SLAM
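A toy illustration of the mapping half of SLAM (a sketch of the general technique, not any one algorithm above): grid-based methods such as GMapping and Hector SLAM accumulate per-cell occupancy evidence in log-odds form, which turns repeated sensor hits into a growing probability of occupancy. The sensor hit probability of 0.7 is an assumed example value:

```python
import math

def update_log_odds(l_prev, p_hit):
    """Accumulate one occupancy observation for a grid cell in log-odds form,
    as grid-based SLAM back ends do. p_hit is the sensor's confidence that
    the cell is occupied given this observation."""
    return l_prev + math.log(p_hit / (1.0 - p_hit))

def to_prob(l):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))

# A cell observed as occupied three times, each with 0.7 confidence:
l = 0.0
for _ in range(3):
    l = update_log_odds(l, 0.7)
```

After three consistent hits the cell's occupancy probability rises well above any single observation's 0.7, which is how the map becomes confident despite a noisy sensor.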

Perception

Perception is one of the key capabilities a robot needs to make decisions, plan and operate in real-world environments. Some examples of robotic perception are obstacle detection, object recognition, semantic place classification, 3D environment representation, terrain classification, pedestrian & vehicle detection, and object tracking.

Sensors: Camera; Lidar; Radar; RGBD

Sensor Data Processing: Mapping and Extraction of the data from the Sensors

AI/ML Inference: Data Analysis, Inference, Prediction

Outcome: Planning, Execution, Navigation
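The four stages above can be sketched as a chain of functions (a simplified illustration, not Ignitarium's stack; the range thresholds and the stub "model" are assumptions, with the ML inference stage reduced to a distance check):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    label: str
    distance_m: float

def process_scan(raw_ranges: List[float]) -> List[float]:
    """Stage 2 (sensor data processing): drop invalid lidar returns,
    keeping only ranges within the sensor's assumed 0-10 m window."""
    return [r for r in raw_ranges if 0.0 < r < 10.0]

def infer(ranges: List[float]) -> List[Detection]:
    """Stage 3 (AI/ML inference) - placeholder: a real system would run a
    trained model; this stub flags anything closer than 1 m as an obstacle."""
    return [Detection("obstacle", r) for r in ranges if r < 1.0]

def plan(detections: List[Detection]) -> str:
    """Stage 4 (outcome): decide an action from the inference output."""
    return "stop" if detections else "proceed"

# Stage 1 (sensor): a made-up raw scan with two invalid returns.
action = plan(infer(process_scan([0.5, 12.0, -1.0, 3.2])))
```

The 0.5 m return survives filtering, is flagged as an obstacle, and drives the robot to stop; swapping the stub for a real detector changes only the middle stage.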

Video

Capabilities Demo Video – Robotics capabilities

One Software Stack Demo Video – One software stack, many applications

Case Studies

Autonomous Mobile Robots