ORCA Dexterity Unveils Trio of Open-Source Robot Hands from $1,500

In a move that’s set to give researchers and developers a literal helping hand, ORCA Dexterity has pulled back the curtain on a trio of new open-source robotic hands, complete with some seriously sophisticated tactile sensing. The firm has announced that all source and print files for the base models will be released into the wild, allowing users to 3D print and tweak the hardware to suit their specific physical AI research needs.

The lineup kicks off with the surprisingly accessible orcahand lite, a 9 Degrees of Freedom (DoF) adaptive hand starting at roughly £1,150 ($1,500). Stepping up a gear is the standard orcahand, which offers a more human-like 17 DoF for around £2,700 ($3,500). But the real showstopper is the orcahand touch: a 17-DoF model that comes fully kitted out with custom tactile sensors across all five digits, starting at £4,700 ($6,100). For those looking for the ultimate setup, a fully-specced pair can nudge an eye-watering £13,800 ($17,900).

The “touch” model is where the magic really happens. It packs 351 taxels per hand, each outputting a full 3D force vector, which lets the hand sense shear, slip, and normal forces simultaneously with a sensitivity of 0.1 N and a spatial resolution of 1 mm. All three models are designed to be field-repairable, mount via the ISO 9409-1 standard flange, and run on the same open-source firmware, orca_core, which is already live on GitHub.
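To make the tactile claims concrete: a per-taxel 3D force vector can be split into a normal component and an in-plane shear component, and comparing the two against a friction cone is a common heuristic for detecting incipient slip. The sketch below assumes nothing about orca_core's actual API; the array layout, function name, and friction coefficient are illustrative, with only the 0.1 N sensitivity figure taken from the spec above.

```python
import numpy as np

# Hypothetical taxel readings: each row is a 3D force vector (Fx, Fy, Fz)
# in newtons, with Fz taken as the component normal to the skin surface.
# The 0.1 N sensitivity comes from the announced spec; the friction
# coefficient and data layout here are illustrative assumptions.
SENSITIVITY_N = 0.1   # minimum resolvable force per the spec
MU = 0.6              # assumed friction coefficient of the contact

def analyse_taxels(forces: np.ndarray, mu: float = MU):
    """Split each 3D force vector into normal and shear parts and flag
    taxels whose shear-to-normal ratio exceeds the friction cone,
    a common heuristic for incipient slip."""
    normal = forces[:, 2]
    shear = np.linalg.norm(forces[:, :2], axis=1)
    active = normal > SENSITIVITY_N            # ignore sub-threshold taxels
    slipping = active & (shear > mu * normal)  # outside the friction cone
    return normal, shear, slipping

# Example: three taxels -- firm grip, sub-threshold touch, shearing contact.
readings = np.array([
    [0.05, 0.00, 1.00],   # mostly normal force: stable contact
    [0.00, 0.00, 0.05],   # below the 0.1 N sensitivity floor
    [0.80, 0.30, 0.50],   # shear exceeds mu * normal: likely slip
])
normal, shear, slipping = analyse_taxels(readings)
print(slipping)  # -> [False False  True]
```

Running the same check across all 351 taxels per hand would give a dense slip map, which is the kind of signal grasp controllers use to tighten a grip before an object actually moves.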

A product shot from the ORCA Dexterity website showing the three new robotic hand models.

Why does this matter?

ORCA Dexterity is making a blatant play to democratise access to high-fidelity robotic manipulation. By open-sourcing the hardware and firmware, the company is effectively lowering the barrier to entry for top-tier robotics research. While commercial systems with this level of sensitivity usually come with astronomical price tags and are locked inside proprietary “walled gardens,” ORCA is offering a customisable, repairable, and—at the entry-level—genuinely affordable platform. This could significantly kickstart innovation in areas like dexterous grasping, human-robot interaction, and the development of more capable physical AI agents that can actually feel the world around them, rather than just bumping into it.