  1. Advancing embodied AI through progress in touch perception, …

    Oct 31, 2024 · Our partnership with Wonik Robotics is poised to create a new advanced dexterous robot hand, fully integrated with tactile sensing leveraging Meta Digit Plexus. Wonik Robotics will …

  2. AI Research - AI at Meta

    Jun 11, 2025 · From robots that can move around and interact with objects to help accomplish household tasks, to wearable glasses that understand the real and digital world and support people throughout …

  3. Introducing V-JEPA 2 - ai.meta.com

    We train V-JEPA 2 on 62 hours of robot data from the Droid dataset, then deploy it on a robot arm in new environments. By specifying tasks as goal images, the model accomplishes tasks like reaching, …

  4. Embodied AI: Toward effective collaboration between humans

    Oct 20, 2023 · Our vision for socially intelligent robots goes beyond the current paradigm by considering dynamic environments where humans and robots interact with each other and the environment …

  5. Introducing the V-JEPA 2 world model and new benchmarks for …

    Jun 11, 2025 · Meta Video Joint Embedding Predictive Architecture 2 (V-JEPA 2) is a world model that achieves state-of-the-art performance on visual understanding and prediction in the physical world. …

  6. Advancing machine intelligence through human-centered research

    Feb 7, 2025 · The work we’re sharing includes the Meta PARTNR dataset and benchmark, aimed at building socially intelligent robots that can assist people in everyday tasks, such as grabbing a …

  7. EgoMimic: Georgia Tech PhD student uses Project Aria Research …

    Feb 19, 2025 · Today, we’re highlighting new research from Georgia Tech that helps train robots to perform basic everyday tasks using egocentric recordings from wearers of Meta’s Project Aria …

  8. OpenEQA: From word models to world models - Meta AI

    Apr 11, 2024 · Imagine an embodied AI agent that acts as the brain of a home robot or a stylish pair of smart glasses. Such an agent needs to leverage sensory modalities like vision to understand its …

  9. Robots that learn from human videos and simulated interactions

    Mar 31, 2023 · First, we’ve built a way for robots to learn from real-world human interactions, by training a general-purpose visual representation model (an artificial visual cortex) from a large number of …

  10. PARTNR: A Benchmark for Planning and Reasoning in ... - ai.meta.com

    Oct 31, 2024 · We present a benchmark for Planning And Reasoning Tasks in humaN-Robot collaboration (PARTNR) designed to study human-robot coordination in household activities.