The human hand is one of the most staggeringly sophisticated and physiologically intricate parts of the body. It has more than 30 muscles and 27 joints, alongside a network of ligaments and tendons that give it 27 degrees of freedom. There are more than 17,000 touch receptors and nerve endings in the palm alone. These features allow our hands to perform a dazzling array of highly complex tasks through a broad range of different movements.
But you don't need to tell any of that to Sarah de Lagarde.
In August 2022, she was on top of the world. She had just climbed Mount Kilimanjaro with her husband and was supremely fit. But just one month later, she found herself lying in a hospital bed, with horrific injuries.
While returning home from work, De Lagarde slipped and fell between a tube train and the platform at High Barnet station in London. Crushed by the departing train and another that then came into the station, she lost her right arm below the shoulder and part of her right leg.
After the long healing process, she was offered a prosthetic arm by the UK's National Health Service, but it offered her little in terms of normal hand movement. Instead, it seemed to prioritise form over functionality.
"It doesn't really look like a real arm," she says. "It was deemed creepy by my children."
The prosthetic only featured a single joint at the elbow while the hand itself was a static mass on the end. For nine months she struggled to perform the daily tasks she had previously taken for granted, but then was offered something transformational – a battery-powered bionic arm utilising artificial intelligence (AI) to anticipate the movements she wants by detecting tiny electrical signals from her muscles.
"Every time I make a movement it learns," De Lagarde says. "The machine learns to recognise the patterns and eventually it turns into generative AI, where it starts predicting what my next move is."
Even picking up something as simple as a pen, and turning it in our fingers to adopt a writing position, involves seamless integration between body and brain. Hand-based tasks that we perform with barely a thought, from opening a door to playing a piano, require a refined combination of both motor control and sensory feedback.
With this level of complexity, it's no wonder that attempts to match the versatility and dexterity of human hands have eluded medical professionals and engineers alike for centuries. From the rudimentary spring-loaded iron hand of a 16th-Century German knight to the world's first robotic hand with sensory feedback, created in 1960s Yugoslavia, nothing has come close to matching the natural abilities of the human hand. Until now.
Advances in AI are ushering in a generation of machines that are getting close to matching human dexterity. Intelligent prostheses, like the one De Lagarde received, can anticipate and refine movement. Soft-fruit picking bots can pluck a strawberry in a field and place it delicately in a punnet of other berries without squishing them. Vision-guided robots can even carefully extract nuclear waste from reactors. But can they really ever compete with the amazing capabilities of the human hand?
I recently gave birth to my first child. Within moments of entering the world, my daughter's small hand wrapped softly around my partner's forefinger. Unable to focus her eyes on anything more than a few inches in front of her, her hand and arm movements are limited, on the whole, to involuntary reflexes that allow her to grip an object when it is placed in her palm. It is an adorable illustration of the sensitivity of our dexterity, even in our earliest moments – and hints at how much it improves as we mature.
Over the coming months, my daughter's vision will progress enough to give her depth perception, while the motor cortex of her brain will develop, giving her increasing control over her limbs. Her involuntary grasps will give way to more deliberate grabbing actions, her hands feeding signals back to her brain, allowing her to make fine adjustments in movement as she feels and explores the world around her. It will take my daughter several years of determined effort, trial, error and play to attain the level of hand dexterity that adults possess.
And much like a baby learning how to use their hands, dexterous robots utilising embodied AI follow a similar roadmap. Such robots must co-exist with humans in an environment, and learn how to carry out physical tasks based on prior experience. They react to their environment and fine-tune their movements in response to such interactions. Trial and error plays a big part in this process.
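That trial-and-error loop can be sketched in miniature. The snippet below is a toy illustration, not any real robot's control code: a simulated gripper tries candidate grip forces, is rewarded only when it holds an object without dropping or crushing it, and gradually settles on a safe force. The force range and reward rule are invented for the example.

```python
import random

random.seed(42)  # deterministic for the example

FORCES = [1, 2, 3, 4, 5]  # candidate grip forces (arbitrary units)

def attempt(force):
    """Hypothetical environment: too little force drops the object,
    too much crushes it. Reward 1 only for a successful hold."""
    return 1.0 if 2 <= force <= 3 else 0.0

value = {f: 0.0 for f in FORCES}  # running estimate of each force's reward
count = {f: 0 for f in FORCES}

for trial in range(500):
    # Epsilon-greedy: mostly exploit the best-known force, sometimes explore.
    if random.random() < 0.1:
        f = random.choice(FORCES)
    else:
        f = max(FORCES, key=lambda x: value[x])
    r = attempt(f)
    count[f] += 1
    value[f] += (r - value[f]) / count[f]  # incremental mean update

best = max(FORCES, key=lambda f: value[f])
print(best)  # a force that holds without crushing
```

After a few hundred attempts the agent's reward estimates steer it to a grip that neither drops nor crushes; embodied systems run the same explore-and-refine cycle over vastly larger action spaces.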
"Traditional AI handles information, while embodied AI perceives, understands, and reacts to the physical world," says Eric Jing Du, professor of civil engineering at the University of Florida. "It essentially endows robots with the ability to 'see' and 'feel' their surrounding environments, enabling them to perform actions in a human-like manner."
But this technology is still in its infancy. Human sensory systems are so complex and our perceptive abilities so adept that reproducing dexterity at the same level as the human hand remains a formidable challenge.
"Human sensory systems can detect minute changes, and rapidly adapt to changes in tasks and environments," says Du. "They integrate multiple sensory inputs like vision, touch and temperature. Robots currently lack this level of integrated sensory perception."
But the level of sophistication is rapidly increasing. Enter the DEX-EE robot. Developed by the Shadow Robot Company in collaboration with Google DeepMind, it's a three-fingered robotic hand that uses tendon-style drivers to provide 12 degrees of freedom. Designed for "dexterous manipulation research", the team behind DEX-EE hope to demonstrate how physical interactions contribute to learning and the development of generalised intelligence.
Each one of its three fingers contains fingertip sensors, which provide real-time three-dimensional data on their environment, along with information regarding their position, force and inertia. The device can handle and manipulate delicate objects including eggs and inflated balloons without damaging them. It has even learned to shake hands – something that requires it to react to interference from outside forces and unpredictable situations. At present, DEX-EE is just a research tool, not for deployment in real-world work situations where it could interact with humans.
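The kind of fingertip force feedback that lets a hand hold an egg without breaking it can be illustrated with a simple proportional control loop. Everything below is a hypothetical sketch: the `Egg` contact model, its stiffness, and the controller gain are invented values, and real grasp controllers are far more sophisticated.

```python
def gentle_grasp(read_force, close_by, target=0.5, gain=0.1, tol=0.02, max_steps=500):
    """Proportional force control: close the finger until the fingertip
    sensor reports the target force, easing off on any overshoot."""
    for _ in range(max_steps):
        error = target - read_force()
        if abs(error) < tol:
            return True          # holding steadily at the target force
        close_by(gain * error)   # negative error opens the finger slightly
    return False

class Egg:
    """Toy contact model: no force until the shell is touched at x = 1.0,
    then force rises like a stiff spring (force = k * compression)."""
    def __init__(self, contact=1.0, stiffness=5.0):
        self.x = 0.0
        self.contact = contact
        self.k = stiffness
    def force(self):
        return self.k * max(0.0, self.x - self.contact)
    def move(self, dx):
        self.x += dx

egg = Egg()
held = gentle_grasp(egg.force, egg.move)
print(held)  # True: the finger settles at a light, non-crushing force
```

The loop closes the finger quickly through free space, then the growing contact force shrinks the error term, so the motion tapers off just as the grip becomes firm; push the gain too high against a stiff object and the same loop would oscillate, which is one reason real controllers model contact dynamics carefully.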
Understanding how to perform such functions, however, will be essential as robots become increasingly present alongside people both at work and at home. How hard, for example, should a robot grip an elderly patient as they move them onto a bed?
One research project at the Fraunhofer IFF Institute in Magdeburg, Germany, set up a simple robot to repeatedly "punch" human volunteers in the arm a total of 19,000 times to help its algorithms learn the difference between potentially painful and comfortable forces. But some dexterous robots are already finding their way into the real world.
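A toy version of the statistics behind such an experiment might look like the following. The numbers and the decision rule are illustrative assumptions, not the Fraunhofer team's actual method: given a set of contacts labelled painful or comfortable, estimate a force limit that stays on the comfortable side of every painful observation.

```python
def comfort_limit(trials):
    """Given (force, was_painful) observations, estimate a safe force limit:
    split the gap between the strongest comfortable contact and the weakest
    painful one, never exceeding the weakest painful force."""
    painful = [f for f, hurt in trials if hurt]
    comfy = [f for f, hurt in trials if not hurt]
    if not painful:
        return max(comfy) if comfy else 0.0
    lo = max((f for f in comfy if f < min(painful)), default=0.0)
    return (lo + min(painful)) / 2

# Invented observations: forces in newtons, True = volunteer reported pain.
observations = [(5, False), (12, False), (20, True), (16, False), (25, True)]
print(comfort_limit(observations))  # 18.0, midway between 16 N and 20 N
```

With thousands of labelled contacts rather than five, the boundary between comfortable and painful forces can be estimated per body region and motion speed, which is what such safety datasets feed into.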
Roboticists have long dreamed of automata with anthropomorphic dexterity good enough to perform undesirable, dangerous or repetitive tasks. Rustam Stolkin, a professor of robotics at the University of Birmingham, leads a project to develop highly dexterous AI-controlled robots capable of handling nuclear waste from the energy sector, for example. While this work typically uses remotely-controlled robots, Stolkin is developing autonomous vision-guided robots that can go where it is too dangerous for humans to venture.
Perhaps the most well-known example of a real-world android is Boston Dynamics' humanoid robot Atlas, which captivated the world back in 2013 with its athletic capabilities. The most recent iteration of Atlas was unveiled in April 2024 and combines computer vision with a form of AI known as reinforcement learning, in which feedback helps AI systems to get better at what they do. According to Boston Dynamics, this allows the robot to perform complex tasks like packing or organising objects on shelves.
But the skills required to perform many of the tasks in human-led sectors where robots such as Atlas could take off, such as manufacturing, construction and healthcare, pose a particular challenge, according to Du.
"This is because the majority of the hand-led motor actions in these sectors require not only precise movements but also adaptive responses to unpredictable variables such as irregular obxt shapes, varying textures, and dynamic environmental conditions," he says.
Du and his colleagues are working on highly-dexterous construction robots that use embodied AI to learn motor skills by interacting with the real world.
At present, most robots are trained on specific tasks, one at a time, which means they struggle to adapt to new or unpredictable situations. This limits their applications. But Du argues that this is changing. "Recent advancements suggest that robots could eventually learn adaptable, versatile skills that enable them to handle a variety of tasks without prior specific training," he says.
Tesla also gave its own humanoid robot Optimus a new hand at the end of 2024. The company released a video of the bot catching a tennis ball in mid-air. However, it was tele-operated rather than autonomous, according to the engineers behind it. The hand has 25 degrees of freedom, they claim.
But while some innovators have sought to recreate human hands and arms in machine form, others have opted for very different approaches to dexterity. Cambridge-based robotics company Dogtooth Technologies has created soft-fruit-picking robots, with highly dexterous arms and precision pincers capable of picking and packing delicate fruits like strawberries and raspberries at the same speed as human workers.
The idea for the fruit-picking robots came to co-founder and chief executive Duncan Robertson while he was lying on a beach in Morocco. With a background in machine learning and computer vision, Robertson wanted to apply his skills to help clean up the litter on the beach, by creating a low-cost robot which could identify, sort, and remove detritus. When he returned home, he applied the same logic to soft fruit farming.
The robots he developed along with the team at Dogtooth use machine learning models to deploy some of the skills that we as humans possess instinctively. Each of the robot's two arms has two colour cameras, much like eyes, which allow them to identify the ripeness of the berries and determine the depth of each of the target fruits from its end "effector", or gripping device.
The robots map the dispersal and arrangement of ripe fruits on a plant and turn this into a sequence of actions, with precise route planning necessary in order to guide the picker arm to the fruit's stem in order to make a cut.
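The depth estimate from the two cameras follows classic stereo triangulation: a berry's apparent horizontal shift between the left and right images (its disparity) is inversely proportional to its distance. The sketch below, with invented camera parameters and a deliberately naive nearest-first pick order, is an illustration of the principle rather than Dogtooth's actual pipeline.

```python
def depth_from_disparity(x_left, x_right, focal_px=700.0, baseline_m=0.06):
    """Stereo triangulation: depth = focal_length * baseline / disparity.
    The focal length (pixels) and camera baseline (metres) are illustrative."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("target must appear further left in the left image")
    return focal_px * baseline_m / disparity

def pick_order(berries):
    """Naive planning heuristic: pick the nearest berries first, so the arm
    never reaches past an unpicked fruit. A real planner must also avoid
    collisions with canes, leaves and neighbouring berries."""
    return sorted(berries, key=lambda b: depth_from_disparity(*b["px"]))

# Hypothetical detections: horizontal pixel position in each camera image.
berries = [
    {"id": "A", "px": (412.0, 342.0)},  # disparity 70 px  -> 0.6 m away
    {"id": "B", "px": (398.0, 258.0)},  # disparity 140 px -> 0.3 m away
]
print([b["id"] for b in pick_order(berries)])  # ['B', 'A']
```

Larger disparity means a closer berry, so sorting by triangulated depth yields a front-to-back picking sequence from nothing more than two matched camera views.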
Dogtooth's robot arms each have seven degrees of freedom, the same as the human arm, meaning these appendages can manoeuvre well enough to find the optimal angle for reaching each berry without damaging others still on the plant. The grasping device then gently grips the fruit by the stem, passing it into an inspection chamber before carefully placing the berry in a punnet for distribution. Another strawberry-picking system, created by Octinion, uses soft grippers to grasp the fruit as it transfers it from plant to basket.
@DavidStruveDesigns
The pressure feedback system is a real game-changer. Able-bodied people with all their limbs don't tend to realise how amazing it is that we can use our hands whilst hardly ever looking at them, and yet get the perfect level of grip on an object pretty much every time. When you're suddenly made aware of it, like watching a video such as this one, it makes you realise just how complex and amazing the human body is.
And how much of our bodily actions/functions are almost completely automated, without need for conscious processing, focus or thought. Also makes you wonder just how much of me is actually me in control, and how much is my body just a pre-programmed robot that I'm merely occupying and taking a ride in.
@sheepasus
It will still take a while. Bionic legs are really close to being as good as normal legs, at least for most daily living tasks, but hands are such a biological marvel that it's really difficult to replace a normal hand, as bionic hands are just not space-efficient. Most bionic hands have motors in the socket, in the hand, or both, and even then we can only achieve partial functionality, because there isn't enough space for the additional motors a fully functional hand would need. In fact, there is still an over 40% rejection rate for bionic hand prosthetics.
@PBST_RAIDZ
It's actually a step ahead, as in Star Wars you can't feel stuff with your prosthetic hand or arm.
@erikmckoul2478
Next they need to figure out how to attach them to a robotic arm for people who can't lift their arms, so I can eat chips without help for once.
@radamanthys0223
It's indeed been moving at a faster pace in the last 10 years or so. Neural interfaces and direct-to-bone fitting are the next steps; the company that makes the BIOM foot is/was working on such neural interfaces, though I haven't heard much from them in the last 5 years or so. Mechanically we are already able to produce prostheses with close to natural degrees of freedom (cue the HK university ankle design that can tiptoe). The next 10 years of this tech seem promising.
@Rizzob17
It makes me think of Terminator. Especially with AI tech accelerating at a much faster rate. Think of those evil enough to pass along their biases to AI, put it in a full cyborg with immense strength, speed and skill.
@therealgoogas8700
In at most like 10 years, if the way things are going doesn't get better, we're gonna be plunged into a mini dark age for a while. More and more people can't afford to buy a home, and with crop failures worldwide, food shortages will become commonplace.
@Optimumlabsllc
I make sockets and I can tell you, sockets won't go obsolete. Right now bone integration has a failure rate that can cause you to become even more of an amputee. Not to mention there will always be people who don't want a rod sticking out of their body permanently. I had a patient that fell on his once; it splintered what was left of his bone. Now, as bone integration gets better, these problems should subside, but sockets won't be obsolete. Especially with people like me innovating to make sure every patient has the most comfortable experience with my prosthetics.
@Derpachu
Amazing advancements. I'm curious if it's possible to automate the twisting of the joints. It might take some innovation, but after that we'd pretty much have fully functional hands. It would need some more programming to make the fingers swivel, but after that we'd have hands that could do everything natural hands can do, and possibly better. Attachment to bones and the nervous system would probably be the next step.
I'm having a hard time believing that we're so close to actual functioning Star Wars technology. What a time to be alive!
Yes, muscle sensors are from the 70s - real Star Wars technology lol