Meta researchers have created an artificial visual cortex to give robots vision



Meta AI researchers have announced several key developments in adaptive skill coordination and visual cortex replication that will allow AI robots to function autonomously in the real world. These developments are a major step toward a universal embodied AI capable of interacting with the real world without human intervention.
The visual cortex is the area of the brain that allows organisms to use vision to perform actions, so an artificial visual cortex is a key requirement for any robot that needs to act on what it sees in front of it. The VC-1 artificial visual cortex is trained on the Ego4D dataset, which contains thousands of hours of first-person video recorded by wearable cameras worn by study participants around the world as they performed everyday activities such as cooking, cleaning, sports, and crafts.
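In code, an artificial visual cortex amounts to a frozen, pretrained encoder that maps camera frames to compact embedding vectors, which a downstream policy then consumes. The article does not give VC-1's API, so the sketch below is purely illustrative: a fixed random projection stands in for the real vision transformer, and the image size and embedding width are assumptions.

```python
import numpy as np

# Toy stand-in for a frozen visual-cortex encoder such as VC-1.
# The real model is a vision transformer; here a fixed random
# projection plays the same architectural role: a frozen function
# mapping raw pixels to an embedding a policy can consume.
# All names and sizes are illustrative assumptions.

rng = np.random.default_rng(0)
EMBED_DIM = 768  # ViT-Base-like embedding width (assumed)

class FrozenVisualEncoder:
    def __init__(self, image_shape=(32, 32, 3)):
        n_pixels = int(np.prod(image_shape))
        # Weights are fixed at construction time and never updated,
        # mirroring how the pretrained cortex stays frozen while
        # only the policy on top of it is trained.
        self.w = rng.standard_normal((n_pixels, EMBED_DIM)) / np.sqrt(n_pixels)

    def __call__(self, frame: np.ndarray) -> np.ndarray:
        # Flatten the frame and project it to the embedding space.
        return frame.reshape(-1) @ self.w

encoder = FrozenVisualEncoder()
frame = rng.random((32, 32, 3))   # stand-in camera frame
embedding = encoder(frame)
print(embedding.shape)            # (768,)
```

The point of the design is that perception is learned once, offline, on large egocentric video corpora; every downstream manipulation or navigation policy reuses the same frozen embedding function.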
However, the visual cortex is only one element of embodied AI. For a robot to work fully autonomously in the real world, it must be able to manipulate real-world objects: move to an object, lift it, carry it to another place, and set it down, all based on what it sees and hears.
To solve this problem, Meta AI experts, in collaboration with researchers from the Georgia Institute of Technology, developed a new technology called Adaptive Skill Coordination (ASC), in which learning takes place in simulation and the resulting skills are then transferred to a real robot. Meta demonstrated ASC's effectiveness in collaboration with Boston Dynamics: ASC was integrated with the Spot robot, which has reliable recognition, navigation, and manipulation capabilities but normally requires significant human intervention.
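The coordination idea can be sketched as a controller that sequences parameterized skills (navigate, pick, place) and retries or re-plans when a skill reports failure. The skill names, the toy world state, and the retry policy below are all assumptions for illustration; the real ASC learns this coordination in simulation rather than following a hand-written script.

```python
# Illustrative skill-coordination sketch: hypothetical skills acting
# on a toy dictionary world state. All of this is assumed structure,
# not the actual ASC implementation.

def navigate(state, goal):
    state["position"] = goal
    return True

def pick(state, obj):
    # Fails unless the robot is at the object's location.
    if state["position"] != state["objects"].get(obj):
        return False
    state["holding"] = obj
    return True

def place(state, obj, goal):
    if state.get("holding") != obj:
        return False
    state["objects"][obj] = goal
    state["holding"] = None
    return True

def rearrange(state, obj, goal, max_retries=2):
    """Coordinate skills to move `obj` to `goal`, retrying on failure."""
    for _ in range(max_retries + 1):
        navigate(state, state["objects"][obj])
        if not pick(state, obj):
            continue                  # e.g. a grasp slipped: re-plan
        navigate(state, goal)
        if place(state, obj, goal):
            return True
    return False

state = {"position": "dock", "holding": None,
         "objects": {"cup": "kitchen"}}
print(rearrange(state, "cup", "table"))   # True
print(state["objects"]["cup"])            # table
```

The retry loop is what makes the coordination "adaptive" in spirit: a failed sub-skill does not abort the whole task but triggers another attempt, which is how the system can ride out slips and disturbances.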
The researchers set the goal of creating an AI model that perceives the world through Spot's onboard sensors via the Boston Dynamics API. ASC was first trained in the Habitat simulator using the HM3D and ReplicaCAD datasets, which contain 3D models of more than a thousand homes. The virtual Spot robot was taught to move around an unfamiliar house, pick up objects, carry them, and put them in the right place. This knowledge was then transferred to real Spot robots, which performed the same tasks automatically based on their learned representation of the premises.
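The train-in-simulation, evaluate-over-episodes pattern described above can be sketched with a stub environment standing in for Habitat. The environment, the scripted policy, and the episode horizon below are placeholders; only the reset/step/evaluate structure reflects the workflow the article describes.

```python
# Minimal episodic-evaluation sketch with a stub environment in
# place of Habitat. Everything here is an illustrative assumption.

class StubRearrangeEnv:
    """Tiny episodic environment: the task succeeds if the agent
    issues 'place' on the final step of the horizon."""
    def __init__(self, horizon=5):
        self.horizon = horizon

    def reset(self):
        self.t = 0
        return {"steps_done": 0}

    def step(self, action):
        self.t += 1
        obs = {"steps_done": self.t}
        done = self.t >= self.horizon
        success = done and action == "place"
        return obs, done, success

def policy(obs, horizon=5):
    # Trivial scripted policy: navigate until the last step, then place.
    return "place" if obs["steps_done"] == horizon - 1 else "navigate"

def evaluate(env, episodes=60):
    wins = 0
    for _ in range(episodes):
        obs, done = env.reset(), False
        while not done:
            obs, done, success = env.step(policy(obs))
        wins += success
    return wins / episodes

rate = evaluate(StubRearrangeEnv())
print(rate)   # 1.0
```

In the real pipeline, the policy trained against the simulator is then run against the physical robot's sensors, and success is counted over real episodes, exactly as in the evaluation loop above.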
"We used two completely different real-world environments in which Spot was asked to rearrange various objects: a fully furnished 185 m² apartment and a 65 m² university laboratory," the researchers report. "ASC achieved near-perfect performance, succeeding in 59 of 60 episodes and overcoming hardware instabilities, picking failures, and adversarial disturbances such as moving obstacles or blocked paths."
Meta researchers are open-sourcing the VC-1 model and sharing detailed information about model scaling and dataset sizes. The team's next goal is to integrate VC-1 with ASC into a single system that comes closer to true embodied AI.




