For years, "robot hands" have meant one of two things: a parallel-jaw gripper that looks like a pair of pliers, or a stiff plastic mannequin claw bolted onto the wrist of a humanoid that could pose for a press photo but couldn't actually do anything useful with its fingers. Newer hands on updated humanoids are better, of course, but you still see the immobile claws on robots that are showing off their dance moves or running a marathon, not doing any useful work.

This morning, San Carlos and Paris-based startup Genesis AI emerged from "a year of quiet building" with GENE-26.5, what the company is calling "the first robotic brain to give robots human-level physical manipulation capabilities." The launch video, which is getting significant attention on X as I write this, shows a robot hand doing some seriously impressive things: cooking a 20-step meal that involves cracking eggs, chopping tomatoes, and adding salt, then serving a breakfast scramble alongside a drink it mixed. Then it switches gears, applies electrical tape to wires (!!!), and solves a Rubik's cube. Importantly, according to Genesis AI, it's working in fully autonomous mode and shown at 1x speed, not sped up like many robot videos.

It also grabs a straw out of a container and separates a single plastic cup from a stack: something that is hard even for humans sometimes. It's one of the most impressive robot manipulation demos I've ever seen, similar in some ways to Kyber Labs' new robot hands.

"Manipulation is the most valuable problem in robotics," Genesis AI says. "It is also the hardest unsolved problem."

Backed by $105 million from Eclipse, Khosla Ventures, Bpifrance, Eric Schmidt, and Xavier Niel — one of France's largest early-stage rounds — Genesis is building the whole stack: a robotics-native foundation model, a 1:1 human-like robotic hand, a noninvasive data collection glove for capturing motion, force, and touch from human workers, and a simulator that compresses weeks of physical experiments into minutes. The pitch is that data has been the bottleneck for robotics foundation models, and the only way past it is to capture human dexterity directly, in the wild, at scale.

If the demo holds up outside a controlled environment, it's a genuinely big deal.

"Literally zero robot hands deployed right now"

A few weeks ago I sat down with Tyler Habowski and Yonatan Robbins, the co-founders of Kyber Labs, for a TechFirst episode about the hardware behind dexterous manipulation. Habowski came out of SpaceX. Robbins came from medical devices and industrial design. They've built a hand that, in the demo videos they shared with me, also does some genuinely impressive things, and they share a similar perspective on the value of highly capable robot hands.

"There are literally zero robot hands deployed right now doing routine work," Habowski told me. "The best hands are hundreds of thousands of dollars, and they break all the time."

Ouch. That’s not what humanoid robot makers want to hear.

We are in a moment where humanoid robotics companies are raising at multi-billion-dollar valuations, where Genesis is announcing the "endgame," and where every other week a new demo video racks up millions of views. And an engineer who came out of SpaceX, who actually builds this stuff, says the installed base of dexterous robot hands doing real work in the real world is, functionally, zero.

That gap between what the demos show and what’s actually deployed is the most important gulf in robotics right now. It’s the gap that limits robotic SKU coverage and, therefore, limits robotic effectiveness in non-standardized applications. Fixing this with a durable solution immediately puts humanoid robots – and other kinds of robots – in the running for all kinds of higher-skill jobs.

But how you close this gap depends a lot on which problem you think you're solving.

Two bets on the same problem

In a way, Genesis AI and Kyber Labs are betting on opposite ends of the stack. Or, at least, they've approached the problem of dexterous manipulation from different directions.

Genesis’s bet is that the AI model is the moat. Build the best foundation model, feed it the largest and most diverse human dexterity dataset you can collect (via the glove), train against the fastest simulator ever built, and the hardware becomes a delivery mechanism. Their hand exists because they couldn’t buy what they needed, not because hardware is their focus. CEO Zhou Xian told TechCrunch the company "decided to go full stack" only after realizing they needed control over the hardware.

The endgame is general-purpose physical AI. The path is data plus compute, and data comes from hardware in motion.

Kyber’s bet started somewhere else: hardware first, software later.

"When I first wanted to build this company, I wanted to build the software," Habowski told me. "I was like, okay, great, let's get some software engineers together. Then I looked at what was available, and it was: the best hands are hundreds of thousands of dollars and they break all the time. So we needed to build our own hardware."

That hardware bet has a specific thesis behind it: humans aren't precise, we're force-driven.

"If I tell you to put your fingers 23.4 millimeters apart, you have no idea," Habowski said. "But if I tell you to apply just enough force to pick up a potato chip and not break it, you can do that all day long."

Most robot hands optimize for the wrong thing: they mimic the kinematics of a human hand (how it moves) without mimicking the actuation modality (how it feels and reacts). That’s why robot hands can look human in photos but feel rigid and clumsy in operation.

Kyber’s design strips out the gearboxes most hands rely on — "you can’t feel anything on the other side of a 300-to-1 gearbox" — and uses what they call torque-transparent actuation. The motors are the force sensors. The hand can detect a feather laid across a finger by measuring impedance in the motor itself: no tactile sensor required, and no expensive force-torque sensor stack.
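"The motors are the force sensors" relies on a basic property of low-gear-ratio actuators: output torque is roughly proportional to motor current, so a tiny external load shows up as a measurable current change. A hypothetical sketch of contact detection on that principle — not Kyber's implementation, with invented constants:

```python
# Illustrative contact detection via motor current (a stand-in for the
# "feather on a finger" example). In a direct-drive or low-ratio
# actuator, torque ≈ torque_constant * current, so the motor doubles
# as a force sensor. Constants are hypothetical example values.

KT = 0.05            # torque constant, N·m per amp (hypothetical)
CONTACT_NM = 0.002   # torque change that counts as "something touched me"

def estimated_torque(current_amps: float) -> float:
    """Estimate joint torque from measured motor current."""
    return KT * current_amps

def detect_contact(baseline_amps: float, measured_amps: float) -> bool:
    """Flag contact when torque rises above the free-motion baseline."""
    delta = estimated_torque(measured_amps) - estimated_torque(baseline_amps)
    return delta > CONTACT_NM

# A feather-light touch appears as a small current increase:
print(detect_contact(baseline_amps=0.10, measured_amps=0.16))  # True
print(detect_contact(baseline_amps=0.10, measured_amps=0.11))  # False
```

Behind a 300-to-1 gearbox, friction and reflected inertia swamp that tiny delta, which is the intuition behind Habowski's "you can't feel anything on the other side" line.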

The best part in a machine is no part, Habowski says: one of about a hundred SpaceX mindsets he absorbed by osmosis.

Kyber’s first commercial deployment isn’t a humanoid. It’s a stationary system at a clinical lab whose technicians sit at benches all day picking up pipettes, uncapping source tubes, mixing samples and following established procedures.

"We're not pitching magical general-purpose autonomy or the orb that will control all robots and do anything you could possibly ask it on day one," Habowski said. "We're pitching a path to get there that is much more pragmatic."

For now, that means no legs, no wheels, and no massive focus on a solve-it-all vision-language-action model. It’s just hands that work on an off-the-shelf arm, doing one job well.

Everyone knows we need to go beyond the demo

The Genesis AI demo is literally awesome. Who doesn't want a humanoid that, in addition to cleaning up after us, can make us great-tasting, healthy meals?

So GENE-26.5 looks like a very meaningful capability jump, and Genesis AI's data-engine-plus-glove approach is a smart way to attack the data scarcity that is limiting robotics foundation models.

But what we also need is durability at reasonable cost. Genesis isn't saying yet what their hand costs to manufacture, how long it lasts under continuous load, or what the service model looks like. All of that matters. As Kyber Labs' Yonatan Robbins told me: a robot hand has to cost less than the person doing the same job. A hand that needs replacing every week or month won't result in a very affordable robot.

And a robotic hand has to do so many different things.

One question an executive put to me a year ago: can a robot wash its hands? It'll need to, after cracking eggs. No one wants egg white on their vacuum cleaner handle or, worse, their front door knob.

I’m sure we’ll get there on both cost and capability, and maybe both Kyber Labs and Genesis AI will get there. But it’s likely to be a few generations down the road.

Genesis's bet is that a foundation model plus a great hand can clear that long tail. Kyber's bet is that you'll get to the long tail faster by deploying narrowly, working hard at force-control hardware, and iterating on real data from real workbenches.

It’s possible both are correct.

I asked Habowski what breakthroughs the field still needs. Better motors? Better sensors? His answer surprised me.

"We need to deploy and iterate. We don't know what we need right now until you actually test."

He compared it to NASA versus SpaceX. NASA spent decades engineering the perfect rocket from every conceivable requirement, and built the Space Shuttle: a great rocket on some measures, with no iteration loop.

SpaceX iterated every single launch. "Through the first 50 launches, no two of those rockets were the same."

There’s a story I told them about a pottery class, probably from the book Art & Fear by Ted Orland and David Bayles. The teacher splits the class in two. One half is graded on quantity: just make as many pots as you can. The other half is graded on quality: make one perfect pot.

Surprise, surprise: at the end of the term, the best pots all come from the quantity group. They got better by making more, breaking more, and learning from the wreckage.

That’s exactly Habowski’s point.

Robotic hands are somewhere between the perfect pot and the fiftieth pot. The Genesis demo is gorgeous. Kyber's pipette-uncapping clinical lab system is also super-impressive. But both need to be in operation, whether it's on a workbench in a hospital lab running blood tests or in a commercial-grade kitchen at a burger joint.

Both bets can be right. Both bets need more use, testing, data, learning, and iteration.

The endgame Genesis is gesturing at — general-purpose hands driven by a robotics-native foundation model — is probably the destination. But the path Kyber is describing — vertically integrated, narrow, deployed, iterating against actual customer workflows — might be more aligned to how you actually find out which parts you’re getting wrong.

Robot hands really are getting insanely good. Certainly in demos, in research labs, and on a small number of carefully chosen workbenches.

The version where there are millions of them folding your laundry, prepping your eggs and threading the nuts in your car’s transmission is still gated by hardware that doesn’t break, software that generalizes, and unit economics that pencil out below the cost of a human.