Paxini Gen3: From Sensing to Tactile-Enhanced Robot Learning

In ~8 hours, go from your first sensor reading to a trained robot manipulation policy that uses tactile feedback. You will build a complete grasp quality detection pipeline integrated with robot arm data collection — the same workflow used in contact-rich manipulation research.

Total Time: ~8 hours
Difficulty: Intermediate
Units: 6 (+ Orientation)
Hardware Required: Paxini Gen3 + any robot arm
Prerequisites: Basic Python, a robot arm
Before you start this path: You should be comfortable writing basic Python scripts (reading data from a device, parsing arrays) and have a robot arm with a Python SDK you can control. You do not need prior robotics research experience. If you are new to robot data collection in general, complete the OpenArm learning path first — it covers data collection fundamentals in more depth.

What You Will Build

By the end of this path you will have:

Live Tactile Heatmap

A streaming pressure visualization running at 500 Hz from your Gen3 sensor, confirming every taxel is functional.

Grasp Quality Detector

An online classifier that distinguishes stable grasps from slip-prone ones in real time during robot operation.
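The core intuition behind slip detection is temporal: a stable grasp produces slow, small pressure changes across the contact patch, while incipient slip produces rapid fluctuation. A minimal sketch of that idea, using frame-to-frame pressure change as the feature — the feature choice, threshold value, and synthetic data here are illustrative assumptions, not the classifier the path trains:

```python
import numpy as np

def slip_score(pressure_frames):
    """Mean absolute frame-to-frame pressure change, averaged over taxels.

    pressure_frames: array of shape (T, H, W) — a short window of frames.
    """
    diffs = np.abs(np.diff(pressure_frames, axis=0))
    return float(diffs.mean())

def is_stable(pressure_frames, threshold=0.05):
    # Threshold is illustrative; in practice you calibrate it (or train a
    # classifier) on labeled stable-vs-slip grasp windows.
    return slip_score(pressure_frames) < threshold

rng = np.random.default_rng(0)
stable   = 1.0 + 0.005 * rng.standard_normal((50, 16, 16))  # steady contact
slipping = 1.0 + 0.2   * rng.standard_normal((50, 16, 16))  # fluctuating contact
print(is_stable(stable), is_stable(slipping))
```

Because the score is computed over a short sliding window, the same function runs online during robot operation: feed it the most recent N frames and act on the boolean.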

Tactile Dataset (50 demos)

A full LeRobot-format dataset with synchronized tactile + joint + camera channels, quality-checked and ready for training.
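Synchronization is the tricky part of this deliverable: the tactile stream runs at 500 Hz while cameras and joint states arrive at a much lower rate, so each low-rate timestep must be matched to its nearest tactile frame before anything is written out. Below is a minimal nearest-timestamp alignment sketch; the 30 Hz control rate is an assumption for illustration, and the actual on-disk format is handled by LeRobot's own dataset tooling, not this code.

```python
import numpy as np

TACTILE_HZ, CONTROL_HZ = 500, 30  # control rate is an illustrative assumption

tactile_ts = np.arange(0, 1, 1 / TACTILE_HZ)  # tactile timestamps (s)
control_ts = np.arange(0, 1, 1 / CONTROL_HZ)  # camera/joint timestamps (s)

def nearest_index(source_ts, query_ts):
    """For each query time, the index of the closest source timestamp."""
    idx = np.searchsorted(source_ts, query_ts)
    idx = np.clip(idx, 1, len(source_ts) - 1)
    left, right = source_ts[idx - 1], source_ts[idx]
    # Pick whichever neighbor is closer in time
    return np.where(query_ts - left < right - query_ts, idx - 1, idx)

# Align the 500 Hz tactile stream to the low-rate control clock:
# aligned[i] is the tactile frame to store with control timestep i.
aligned = nearest_index(tactile_ts, control_ts)
print(aligned[:5])  # → [ 0 17 33 50 67]
```

Quality-checking the dataset then includes verifying this alignment, e.g. that the residual time offset between each stored tactile frame and its control timestep stays below one tactile sample period.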

Tactile-Aware Policy

A trained ACT or Diffusion Policy model that uses tactile input — evaluated against a vision-only baseline.