
Figure robots - Updates and Discussion


Hamartia Antidote
Elite Member · Nov 17, 2013

Figure's humanoid can now watch, learn and perform tasks autonomously


Figure's Brett Adcock claimed a "ChatGPT moment" for humanoid robotics over the weekend. Now we know what he means: the robot can now watch humans doing tasks, build its own understanding of how to do them, and start doing them entirely autonomously.

General-purpose humanoid robots will need to handle all sorts of jobs. They'll need to understand all the tools and devices, objects, techniques and objectives we humans use to get things done, and they'll need to be as flexible and adaptable as we are in an enormous range of dynamic working environments.

They're not going to be useful if they need a team of programmers telling them how to do every new job; they need to be able to watch and learn – and multimodal AIs capable of watching and interpreting video, then driving robotics to replicate what they see, have been taking revolutionary strides in recent months, as evidenced by Toyota's incredible "large behavior model" demonstration in September.

But Toyota is using bench-based robot arms in a research center. Figure, like Tesla, Agility, and a growing number of other companies, is laser-focused on self-sufficient full-body humanoids that can theoretically go into any workplace and eventually learn to take over any human task. And these are not research programs; these companies want products out in the market yesterday, starting to pay their way and getting useful work done.

Adcock told us he hoped to have the 01 robot deployed and demonstrating useful work around Figure's own premises by the end of 2023 – and while that doesn't seem to have transpired at this point, a watch-and-learn capability in a humanoid is indeed big news.

The demonstration in question, mind you, is not the most Earth-shatteringly impressive task; the Figure robot is shown operating a Keurig coffee machine, with a cup already in it. It responds to a verbal command, opens the top hatch, pops a coffee pod in, closes the hatch and presses the button, and lets the guy who asked for the coffee grab the full cup out of the machine himself.

So yes, it's fair to say the human and the Keurig machine are still doing some heavy lifting here – but that's not the point. The point is, the Figure robot took 10 hours to study video, and can now do a thing by itself. It's added a new autonomous action to its library, transferable to any other Figure robot running on the same system via swarm learning.
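
Figure hasn't published how this cross-robot transfer actually works, but the general idea of a shared skill library is easy to sketch. The `Skill` and `FleetSkillLibrary` names below are hypothetical, purely to illustrate a behaviour being learned once and then pulled down by every robot running the same software:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Skill:
    """A named behaviour distilled from demonstration video (hypothetical schema)."""
    name: str
    steps: List[str]        # primitive actions the policy sequences
    policy_weights: bytes   # learned parameters, opaque to the library

@dataclass
class FleetSkillLibrary:
    """Central store that every robot on the same software version syncs against."""
    skills: Dict[str, Skill] = field(default_factory=dict)

    def publish(self, skill: Skill) -> None:
        # One robot learns the task from video, then uploads the result once.
        self.skills[skill.name] = skill

    def sync(self) -> Dict[str, Skill]:
        # Every other robot pulls the library instead of re-learning from scratch.
        return dict(self.skills)

# One robot spends ~10 hours learning from video, then shares the result...
library = FleetSkillLibrary()
library.publish(Skill(
    name="make_keurig_coffee",
    steps=["open_lid", "insert_pod", "close_lid", "press_brew"],
    policy_weights=b"...",  # stand-in for real model parameters
))

# ...and any other robot on the same system can now execute it.
print(list(library.sync()))  # ['make_keurig_coffee']
```

The hard engineering obviously lives inside those policy weights; the library itself is just plumbing, which is what makes the "learn once, deploy to the whole fleet" pitch so attractive.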

If that learning process is robust across a broad range of different tasks, then there's no reason why we shouldn't start seeing a new video like this every other day, as the 01 learns to do everything from peeling bananas, to putting pages in a ring binder, to screwing jar lids on and off, to using spanners, drills, angle grinders and screwdrivers.

It shouldn't be long before it can go find a cup in the kitchen, check that the Keurig's plugged in and has plenty of water in it, make the damn press-button coffee, and bring it to your desk without spilling it – a complex task making use of its walking capabilities and a large language model's ability to break things down into actionable steps.
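
That LLM-driven task breakdown can be sketched in a few lines. None of this reflects Figure's actual stack: `call_llm` is a stub for whatever model a real planner would query, and the skill names are invented for illustration.

```python
from typing import List

KNOWN_SKILLS = {"find_cup", "check_water_level", "make_keurig_coffee",
                "walk_to_desk", "place_cup_on_desk"}

PROMPT = """Break the user's request into an ordered list of robot skills,
one per line, using only these skills: {skills}
Request: {request}"""

def call_llm(prompt: str) -> str:
    # Placeholder: swap in a real LLM client here. The canned reply simply
    # shows the shape of output a planner would expect back.
    return "find_cup\ncheck_water_level\nmake_keurig_coffee\nwalk_to_desk\nplace_cup_on_desk"

def plan(request: str) -> List[str]:
    raw = call_llm(PROMPT.format(skills=sorted(KNOWN_SKILLS), request=request))
    steps = [line.strip() for line in raw.splitlines() if line.strip()]
    # Drop any hallucinated step the robot has no skill for.
    return [s for s in steps if s in KNOWN_SKILLS]

print(plan("Make me a coffee and bring it to my desk"))
```

The filtering step at the end matters: the language model proposes, but only actions the robot actually has in its skill library get executed.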

So don't get hung up on the coffee; watch this space. If Figure's robot really knows how to watch and learn now, we're going to feel a serious jolt of acceleration in the wild frontier of commercial humanoid robotics as 2024 starts to get underway. And even if Figure is overselling its capabilities – not that any tech startup would dream of doing such a thing – it ain't gonna be long, and there's a couple dozen other teams manically racing to ship robots with these capabilities.

Make no mistake: humanoid robots stand to be an absolutely revolutionary technology once they're deployed at scale, capable of fundamentally changing the world in ways not even Adcock and the other leaders in this field can predict. The meteoric rise of GPT and other language model AIs has made it clear that human intelligence won't be all that special for very long, and the parallel rise of the humanoids is absolutely designed to put an end to human labor.

Things are happening right now that would've been absolutely unthinkable even five years ago. We appear to be right at the tipping point of a technological and societal upheaval bigger than the agricultural or industrial revolutions – one that could unlock a world of unimaginable ease and plenty, and/or possibly relegate 95% of humans to the status of zoo animals or house plants.
 

Hamartia Antidote
Elite Member · Nov 17, 2013

BMW will deploy Figure’s humanoid robot at South Carolina plant



Figure today announced a “commercial agreement” that will bring its first humanoid robot to a BMW manufacturing facility in South Carolina. The Spartanburg plant is BMW’s only one in the United States. As of 2019, the 8 million-square-foot campus boasted the highest yield among the German manufacturer’s factories anywhere in the world.

BMW has not disclosed how many Figure 01 models it will deploy initially. Nor do we know precisely what jobs the robot will be tasked with when it starts work. Figure did, however, confirm with TechCrunch that it is beginning with an initial five tasks, which will be rolled out one at a time.

While folks in the space have been cavalierly tossing out the term “general purpose” to describe these sorts of systems, it’s important to temper expectations and point out that they will all arrive as single- or multi-purpose systems, growing their skillset over time. Figure CEO Brett Adcock likens the approach to an app store — something that Boston Dynamics currently offers with its Spot robot via SDK.
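
Adcock's app-store analogy maps naturally onto a plugin-style interface: each capability is packaged against a small, stable contract and can be installed without touching the rest of the stack. The sketch below is purely illustrative and is not Figure's or Boston Dynamics' actual SDK:

```python
from abc import ABC, abstractmethod
from typing import Dict

class RobotApp(ABC):
    """Contract every installable capability satisfies (illustrative only)."""
    name: str

    @abstractmethod
    def run(self, robot: str) -> None:
        """Execute the capability on the named robot."""

REGISTRY: Dict[str, RobotApp] = {}

def install(app: RobotApp) -> None:
    # "Installing" a capability just registers it against the stable interface;
    # nothing else in the stack needs to change.
    REGISTRY[app.name] = app

class UnloadPallet(RobotApp):
    name = "unload_pallet"

    def run(self, robot: str) -> None:
        print(f"{robot}: unloading pallet")

install(UnloadPallet())
REGISTRY["unload_pallet"].run("figure-01-unit-7")
```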


Likely initial applications include standard manufacturing tasks such as box moving, pick-and-place, and pallet loading and unloading — basically the sort of repetitive tasks for which factory owners claim to have difficulty retaining human workers. Adcock says that Figure expects to ship its first commercial robot within a year, an ambitious timeline even for a company that prides itself on quick turnaround times.

The initial batch of applications will be largely determined by Figure’s early partners like BMW. The system will, for instance, likely be working with sheet metal to start. Adcock adds that the company has signed up additional clients, but declined to disclose their names. It seems likely Figure will instead opt to announce each individually to keep the news cycle spinning in the intervening 12 months.

Unlike some other humanoid designers (including Agility), Figure is focused on creating a dexterous, human-like hand for manipulation. The thinking behind such an end effector is the same that’s driving many toward the humanoid form factor in the first place: namely, we’ve designed our workspaces with us in mind. Adcock alludes to Figure 01 being tasked with an initial set of jobs that require high dexterity.

As for the importance of legs, the executive suggests that they matter as much for maneuvering during certain tasks as — or more than — for things like walking up stairs and over uneven terrain, which tend to get most of the love during these conversations.

Training, meanwhile, will involve a mix of approaches, including reinforcement learning, simulation and teleoperation to help the robot out of potential jams. Figure 01 will very much be learning on the job, as well, refining its approach during real-world testing, much as we humans do. As for whether the systems will be long-term additions to the BMW line, that depends entirely on whether the robots are able to meet the automaker’s internal expectations of output. Meantime, Figure is effectively leasing the systems through RaaS (robotics as a service), a model it expects to maintain for the foreseeable future.
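
The training mix described here (demonstrations plus refinement through trial and error) can be illustrated with a deliberately tiny, self-contained example. This is a toy one-dimensional reaching task, not Figure's pipeline: a linear policy is first fit to noisy teleoperation data (behaviour cloning), then nudged toward higher simulated reward as a crude stand-in for reinforcement-learning fine-tuning.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Teleoperation: a human drives the robot, logging (state, action) pairs ---
states = rng.uniform(-1.0, 1.0, size=200)                # e.g. target position
demo_actions = states + rng.normal(0.0, 0.1, size=200)   # operator is roughly on target

# --- Behaviour cloning: fit a simple linear policy to the demonstrations ---
w = float(np.sum(states * demo_actions) / np.sum(states * states))

def simulated_reward(weight: float, trials: int = 500) -> float:
    # Toy simulator: reward measures how closely the policy's action lands on the target.
    s = rng.uniform(-1.0, 1.0, size=trials)
    return float(-np.mean((weight * s - s) ** 2))

# --- Refinement in simulation: crude random search standing in for RL fine-tuning ---
for _ in range(200):
    candidate = w + rng.normal(0.0, 0.02)
    if simulated_reward(candidate) > simulated_reward(w):
        w = candidate

print(f"policy weight after cloning + refinement: {w:.3f}")  # converges toward 1.0
```

A production system would use far richer policies and simulators, but the division of labour is the same: demonstrations get the robot into the right ballpark, and simulated or on-the-job experience closes the remaining gap.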
 
