Market Research Report
Product Code: 2007699

Embodied AI Robot Large Model (Including VLA) Research Report, 2026

Publication Date: | Publisher: ResearchInChina | English, 480 Pages | Delivery Time: within 1-2 business days



Product Code: BHY011

Research on Robot Large Models: World Models Are About to Become Standard, and OEMs Enter and Accelerate Mass Production and Application

ResearchInChina has released the Embodied AI Robot Large Model (Including VLA) Research Report, 2026, which focuses on the research, analysis, and summary of the following content:

The basic concepts, industrial ecosystem map, multi-dimensional classification (application scope, capability modality, architecture), industry development drivers, key technology development directions, and commercialization modes of Embodied AI robot large models;

The layout planning, team building, core talents, large model products and their applications, detailed introduction and implementation status of Embodied AI robot large model products, Embodied AI ecosystem partners, and recent key dynamics of 11 tech giants in the Embodied AI robot field, including Alibaba Group, NVIDIA, Google DeepMind, OpenAI, Microsoft, Huawei, Tencent RoboticsX, Baidu, ByteDance, iFlytek, and SenseTime;

The profile, development history and planning, robot products and large model installation, detailed introduction of self-developed large models, large model ecosystem cooperation, and recent key dynamics of 10 well-known robot enterprises, including UBTECH Robotics, Unitree Robotics, AgiBot, Leju Robotics, Galbot, RobotEra, FigureAI, Sanctuary AI, 1X Technologies, and Neura Robotics;

The layout planning, team building, core talents, robot products and large model installation, summary of large model products, detailed introduction of Embodied AI robot large model products, Embodied AI ecosystem partners, and recent key dynamics of 11 OEMs in the Embodied AI robot field, including Tesla, Toyota, Honda, Hyundai, Xiaomi, XPeng, GAC Group, Chery, Leapmotor, BYD, and Dongfeng Motor. In addition, this report summarizes the layout of 13 other global OEMs in the Embodied AI robot field.

Unlike traditional robot control algorithms, Embodied AI robot large models ("robot large models" for short) can make end-to-end or hierarchical decisions without precise modeling, and can operate in unstructured, open environments (homes, the outdoors, cluttered desktops). Compared with general-purpose large models, Embodied AI robot large models place more emphasis on fusing and understanding multi-modal information (vision, lidar, touch, text, etc.), aiming to complete closed-loop actions in the physical world and output motion commands such as joint angles, speeds, and grasping forces.

In recent years, the Embodied AI robot large model field has shown the following development trends:

1. Embodied AI Players Have Begun to Apply World Models

Currently, robot large models represented by Vision-Language-Action (VLA) models have made significant progress in the "perception-decision-execution" closed loop, enabling robots to understand instructions and generate actions. However, such models still face bottlenecks in coping with the high diversity and uncertainty of the physical world. In essence, they largely "imitate" patterns in the training data, lacking foresight into the consequences of their actions and an understanding of the underlying physical logic.

The introduction of world models is intended precisely to break this limitation. The core of a world model is to give robots the ability to "imagine the future". Trained on multi-modal data, it constructs an internal dynamic representation of the physical environment and can predict state changes over multiple future steps from the current state and planned actions. This means robots can transform from passive instruction followers into active decision-makers capable of "mental deduction". For example, when performing a "pouring water" task, a robot equipped with a world model can not only identify the cup and kettle but also predict the water flow trajectory, cup tilt angle, and possible spills before acting, thereby planning a safer and more accurate action sequence.
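The predict-then-plan loop described above can be sketched in a few lines of Python. Everything here is an illustrative assumption rather than any vendor's actual model: a hand-written one-step dynamics function stands in for a learned world model, and the "pouring water" task is reduced to choosing a kettle tilt plan that fills the cup without spilling.

```python
# Toy "world model" sketch: a hand-written dynamics function standing in
# for a learned multi-modal predictor. All names, dynamics, and numbers
# are illustrative assumptions, not any vendor's model.
def predict_next_state(state, action):
    """Predict the next state from the current state and one planned action."""
    flow = max(0.0, (action["tilt_deg"] - 20) / 100)   # pouring starts past 20 degrees
    fill = state["cup_fill"] + flow
    spill = state["spilled"] + max(0.0, fill - 1.0)    # overflow becomes spill
    return {"cup_fill": min(fill, 1.0), "spilled": spill}

def rollout(state, plan):
    """'Imagine the future': apply a planned action sequence step by step."""
    for action in plan:
        state = predict_next_state(state, action)
    return state

def plan_cost(state, plan, target_fill=0.9):
    final = rollout(dict(state), plan)
    # Penalize spilling heavily, under-filling mildly.
    return 10.0 * final["spilled"] + abs(target_fill - final["cup_fill"])

start = {"cup_fill": 0.0, "spilled": 0.0}
candidates = [
    [{"tilt_deg": 90}] * 4,   # aggressive pour: fills fast but overflows
    [{"tilt_deg": 45}] * 4,   # moderate pour
]
# The robot "rehearses" each plan inside the model before acting.
best = min(candidates, key=lambda p: plan_cost(start, p))
print(best[0]["tilt_deg"])  # → 45
```

The key point the sketch illustrates is that the plan is scored entirely inside the model's imagined rollout; no real pour (and no real spill) is needed to reject the bad plan.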

Driving forces for the application of world models mainly come from three aspects:

Solving the data bottleneck: The collection of high-quality real robot data is extremely costly and limited in scale, having become a core constraint on capability upgrading. World models can serve as powerful "data generators" and "simulation engines", generating massive, controllable, and high-fidelity synthetic training scenarios, and greatly reducing the reliance on expensive real robot data.

Improving decision and generalization capabilities: Through prediction and deduction, world models enable robots to have a certain degree of causal reasoning and physical intuition, capable of handling new scenarios and new objects not seen in training, and achieving "learning by analogy".

Realizing the collaborative evolution of "cerebrum" and "cerebellum": The industry consensus is that future robots' intelligence will be the result of collaborative evolution of the "cerebrum" (high-level cognition and planning) and the "cerebellum" (low-level motion control). As a key component of the high-level "cerebrum", the world model forms a complementary relationship with execution-oriented models such as VLA, jointly constituting a complete intelligent system.
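The first driving force above, the world model as a "data generator", can be sketched as follows. The one-dimensional dynamics function is a toy stand-in for a learned world model; the point is that random action sequences rolled out inside the model yield (state, action, next-state) training tuples without touching a real robot.

```python
import random

# Illustrative sketch of a world model used as a synthetic "data generator".
# The 1-D dynamics are a toy assumption, not a real learned model.
random.seed(0)

def world_model(state: float, action: float) -> float:
    # Toy dynamics: next gripper position given a velocity command.
    return state + 0.1 * action

def synthesize_trajectories(n_traj: int, horizon: int):
    """Roll out random action sequences to build (s, a, s') training tuples."""
    dataset = []
    for _ in range(n_traj):
        s = random.uniform(-1.0, 1.0)        # sampled initial state
        for _ in range(horizon):
            a = random.uniform(-1.0, 1.0)    # exploratory action
            s_next = world_model(s, a)       # "imagined" outcome, no real robot
            dataset.append((s, a, s_next))
            s = s_next
    return dataset

data = synthesize_trajectories(n_traj=100, horizon=20)
print(len(data))  # → 2000 synthetic transitions
```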

Many enterprises have developed their own world models, such as Alibaba's WorldVLA, NVIDIA's WAM, Tencent's Hunyuan 3D World Model, Unitree Robotics' UnifoLM-WMA-0, and AgiBot's GE-1. Among them, Unitree Robotics' UnifoLM-WMA-0, released and open-sourced around September 2025, is an open-source model designed specifically for general robot learning. It has been adapted to the company's humanoid and quadruped robots and offers two modes: decision and simulation. The decision mode can predict future physical interactions (such as stacking stability and collision risks), correct actions, and improve robustness on complex tasks. The simulation mode can generate high-fidelity synthetic data to address the scarcity of real robot training data.

AgiBot's world model GE-1, released in August 2025, is a video-generative world model for robot control. With a closed-loop architecture of "video generation + policy learning + simulation evaluation", it realizes end-to-end reasoning from "seeing" to "thinking" to "acting". GE-1 works alongside AgiBot's GO-1 series base models: GO-1 focuses on general task planning and common-sense knowledge, while GE-1 specializes in spatiotemporal prediction and action rehearsal, improving the task success rate and stability of the G2 robot in complex scenarios.

In October 2025, GE-1 was officially deployed on G2, AgiBot's industrial-grade interactive embodied operation robot, and AgiBot announced that it had won an order worth hundreds of millions of yuan from Longcheer Technology. The robot has performed tasks such as "making sandwiches", "pouring tea", and "wiping the desktop".

2. Robot Large Models Achieve Cross-Platform Applications

In the traditional robot development mode, each robot's software and algorithms must be specially developed and optimized for its unique hardware configuration (sensors, actuators, form factor), leading to high R&D costs, long cycles, and non-reusable capabilities. Cross-platform application of robot large models can overcome these drawbacks. By building a powerful end-to-end multi-modal foundation model, developers can implant transferable general intelligence into robots, enabling them to cross the boundaries of different embodiments (such as humanoid, quadruped, and robotic arm), different tasks, and different environments, and to rapidly generalize and deploy capabilities.
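The "one model, many embodiments" pattern above can be sketched as a shared backbone that emits an embodiment-agnostic action intent, plus a thin per-robot adapter that maps the intent to that robot's own command vector. All class, field, and function names here are illustrative assumptions, not any vendor's API.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class ActionIntent:
    direction: List[float]   # desired end-effector motion in a shared latent frame
    grip: float              # 0 = open, 1 = closed

def shared_backbone(instruction: str, image_features: List[float]) -> ActionIntent:
    # Stand-in for the multi-modal foundation model's forward pass (trained once).
    return ActionIntent(direction=[0.1, 0.0, -0.05], grip=1.0)

# Per-embodiment adapters: only this small layer is robot-specific.
def arm_adapter(intent: ActionIntent) -> Dict[str, float]:
    return {"joint1": intent.direction[0], "joint2": intent.direction[2],
            "gripper": intent.grip}

def quadruped_adapter(intent: ActionIntent) -> Dict[str, float]:
    return {"vx": intent.direction[0], "vy": intent.direction[1],
            "body_pitch": intent.direction[2]}

ADAPTERS: Dict[str, Callable[[ActionIntent], Dict[str, float]]] = {
    "robotic_arm": arm_adapter,
    "quadruped": quadruped_adapter,
}

def act(platform: str, instruction: str, image_features: List[float]) -> Dict[str, float]:
    intent = shared_backbone(instruction, image_features)  # shared, transferable part
    return ADAPTERS[platform](intent)                      # embodiment-specific part

print(act("robotic_arm", "pick up the cup", [0.0]))
print(act("quadruped", "walk forward", [0.0]))
```

Porting the model to a new embodiment then means writing one small adapter rather than redeveloping the full software stack, which is the economic argument the paragraph above makes.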

Starting from 2025, robot large models such as NVIDIA's GR00T series, Google DeepMind's Gemini Robotics, Microsoft's Rho-alpha, Huawei's CloudRobo, and RobotEra's ERA-42 all support cross-robot platform development and cross-scenario applications.

In Q3 2025, NVIDIA released the GR00T N1.6 large model, positioned as a general-purpose VLA large model for humanoid robots. Through a unified multi-modal interface, a modular adaptation layer, a simulation-reality collaborative pipeline, and a hierarchical deployment architecture, it realizes "train once, adapt to many machines" cross-platform application. It supports humanoid dual-arm robots, mobile robotic arms, warehouse AGVs, medical assistive robots, scientific research robots, and more. It can execute tasks on new objects and in new scenarios without large amounts of data, and can be flexibly adapted to application scenarios such as industrial manufacturing, logistics and warehousing, household and commercial services, medical care, and scientific R&D.

RobotEra's end-to-end VLA embodied large model ERA-42 was released in December 2024 and initially adapted to its dexterous hand XHAND1. In mid-2025, the model was successively applied across platforms to the wheeled service robot Q5 and the bipedal humanoid robot L7, enabling rapid adaptation to new tasks without pre-programming.

3. An Increasing Number of Robot Large Models Are Open-Sourced

The open-sourcing of large models is not mere technology sharing. Open-source models gather the wisdom of global developers and can quickly overcome complex "long-tail problems" in the physical world. At the same time, open-sourcing breaks the traditional closed-source business model, allowing small and medium-sized enterprises to develop quickly on top of open-source models and focus their resources on hardware innovation and scenario implementation, forming an industrial pattern of "giants build the platform, and hundreds of enterprises perform on it".

The core of open-sourcing is to lower the R&D threshold, accelerate technological iteration, build ecosystem barriers, promote large-scale implementation, and form a positive flywheel of "open source → ecosystem → data → more powerful models".

Xiaomi's VLA large model for Embodied AI robots, Xiaomi-Robotics-0, was officially open-sourced on February 12, 2026 under the Apache License 2.0 (which allows commercial use, modification, and distribution without copyleft "contagion"). The full stack was open-sourced without reservation: complete code, pre-trained weights, technical documentation, papers, deployment solutions, and more. Xiaomi-Robotics-0 reuses Xiaomi's autonomous driving perception and decision technology to realize technology interoperability between robots and automobiles. It adopts a Mixture of Experts (MoE) architecture that separates the "cerebrum" (vision-language understanding) from the "cerebellum" (action execution). This design ameliorates the inference-latency problem of traditional VLA models, making it better suited to consumer robot products that require real-time response.
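The cerebrum/cerebellum separation described above is commonly realized as a hierarchical fast-slow loop: a slow vision-language planner updates a subgoal at low frequency, while a fast controller emits motor commands every tick. The sketch below is a hedged illustration of that pattern only; the frequencies, function names, and toy logic are assumptions, not Xiaomi's implementation.

```python
# Hedged sketch of a hierarchical "cerebrum / cerebellum" control split.
# All numbers and names are illustrative assumptions.

CEREBRUM_PERIOD = 10  # slow planner runs once per 10 control ticks

def cerebrum_plan(observation: str) -> str:
    # Stand-in for the slow vision-language reasoning expert.
    return "reach_cup" if "cup" in observation else "idle"

def cerebellum_step(subgoal: str, tick: int) -> dict:
    # Stand-in for the fast action expert: cheap, runs every tick.
    delta = 0.01 if subgoal == "reach_cup" else 0.0
    return {"tick": tick, "joint_delta": delta}

def control_loop(observation: str, ticks: int):
    subgoal, commands = "idle", []
    for t in range(ticks):
        if t % CEREBRUM_PERIOD == 0:                      # slow path: re-plan rarely
            subgoal = cerebrum_plan(observation)
        commands.append(cerebellum_step(subgoal, t))      # fast path: every tick
    return commands

cmds = control_loop("a cup on the table", ticks=30)
print(len(cmds), cmds[0]["joint_delta"])  # → 30 0.01
```

Because the expensive reasoning model sits outside the per-tick loop, its latency bounds only how often the subgoal refreshes, not how fast motor commands are issued; that is the real-time benefit the paragraph above attributes to the separated design.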

4. OEMs Enter the Market to Solve the Scarcity of Real Data for Embodied AI Robot Large Models and Provide Field Verification Scenarios

The entry of multiple OEMs into the Embodied AI and humanoid robot track brings massive industrial scenario data, automotive-grade sensor data, and a mature autonomous driving technology stack to Embodied AI large models (VLA models, world models, etc.). Algorithms such as BEV perception, multi-modal fusion, and end-to-end decision-making can be directly migrated to robots to train and improve the models' environmental understanding, task planning, and motion control capabilities. OEM production line scenarios can verify the reliability and success rate of robot large models while exposing model defects, provide highly reliable real robot interaction data for future model correction, and effectively narrow the large gap between simulation and reality.

In addition, OEMs bring automotive-grade safety standards and hardware-software co-design to robots, substantially improving the inference latency, reliability, and deployment efficiency of large models. The core supply chains of automobiles and robots (batteries, motors, sensors, domain controllers, etc.) overlap heavily; some institutions estimate the overlap rate exceeds 50%. Economies of scale greatly reduce the cost of core hardware, and model deployment costs decrease in step.

For example, to solve the data problem, GAC Group draws on its autonomous driving data collection experience: it sends robots into real scenarios to collect real data while carrying out in-depth adaptation and field verification of core functions, forming a closed-loop data growth model of "learning by using, using by learning". As for cost reduction, its robots reuse vehicle components (such as chips and lidar) and achieve 100% localization of key components. GAC plans to mass-produce its fourth-generation product, GoMate Mini, in 2027, with the security scenario as its robots' first commercial application field.

Table of Contents

1 Overview of Embodied AI Robot Large Models and Key Technology Development Directions

  • 1.1 Core Definitions of Embodied AI Robot Large Models
    • 1.1.1 Definition and Evolution of Embodied AI: Shifting from Weak Interaction to High Autonomy
    • 1.1.2 Definition of Embodied AI Robots: Autonomously Understanding the Environment and Completing Tasks via Artificial Intelligence
    • 1.1.3 Definition of Embodied AI Robot Large Models
  • 1.2 Global Industrial Ecosystem Map of Embodied AI Robot Large Models
  • 1.3 Classification of Embodied AI Robot Large Models
    • 1.3.1 By Application Scope
    • 1.3.2 By Capability Modality
    • 1.3.3 By Architectural Form
  • 1.4 Industry Development Drivers of Embodied AI Robot Large Models
    • 1.4.1 Overview
    • 1.4.2 Policies as the Core Engine
    • 1.4.3 Technology
    • 1.4.4 Market Demand
    • 1.4.5 Increased Capital Investment
    • 1.4.6 Industrial Collaboration
    • 1.4.7 Data Closed Loop Facilitates Model Iteration
    • 1.4.8 Aggregation of Interdisciplinary Talents
  • 1.5 Key Technology Development Directions of Embodied AI Robot Large Models
    • 1.5.1 Overview
    • 1.5.2 Multi-Modal Perception and Unified Representation
    • 1.5.3 World Model
    • 1.5.4 VLA End-to-End Architecture
    • 1.5.5 Hierarchical Fast-Slow System
    • 1.5.6 Enhancing Generalization Ability and Data Efficiency
    • 1.5.7 Safety and Reliability
    • 1.5.8 Lightweight and Edge Deployment
  • 1.6 Commercialization Modes of Embodied AI Robot Large Models
    • 1.6.1 Model Technology Output
    • 1.6.2 Integrated Hardware and Software Sales
    • 1.6.3 Scenario-Based Service Operation
    • 1.6.4 Data and Tool Ecosystem Services
    • 1.6.5 Key Strategies and Evolution Directions for Commercial Implementation of Embodied AI Robot Large Models

2 Global Major Players and Products: Tech Giant Camp

  • 2.1 Summary of Typical Embodied AI Large Model Products of Tech Giants (1)-(3)
  • 2.2 Alibaba Group
    • 2.2.1 Industrial Layout and Planning for Embodied AI Robots
    • 2.2.2 Large Model R&D and Engineering Team: Tongyi Lab
    • 2.2.3 Establishment of the "Robot and Embodied AI Business Unit": Detailed Introduction
    • 2.2.4 Establishment of the "Robot and Embodied AI Business Unit": 2026-2028 Business Plan
    • 2.2.5 Core Team Members and Their Resumes of Embodied AI Robot Large Models
    • 2.2.6 Large Model Product System
    • 2.2.7 Embodied AI Robot Large Models: Milestones in the Development
    • 2.2.8 Embodied AI Robot Large Models: Products Summary
    • 2.2.9 Embodied AI Robot Large Models: RynnBrain Series - The World's First Embodied AI Brain Foundation Model with Spatiotemporal Memory
    • 2.2.10 Embodied AI Robot Large Models: Flagship General Embodied Model RynnBrain30BA3B
    • 2.2.11 Embodied AI Robot Large Models: RynnVLA001
    • 2.2.12 Embodied AI Robot Large Models: RynnEC - Video Multi-Modal Embodied Cognition Model
    • 2.2.13 Embodied AI Robot Large Models: WorldVLA - Fully Autoregressive Embodied AI Large Model
    • 2.2.14 Embodied AI Robot Large Models: Summary of Implemented Robots
    • 2.2.15 Embodied AI Robot Large Models: Ecosystem Partners
  • 2.3 NVIDIA
    • 2.3.1 Profile
    • 2.3.2 Industrial Layout History of Embodied AI Robots
    • 2.3.3 Core Team for Embodied AI Robots
    • 2.3.4 Summary of Embodied AI Robot-Related Products
    • 2.3.5 Embodied AI Robot Large Models: Development History
    • 2.3.6 Embodied AI Robot Large Models: Products Summary
    • 2.3.7 Embodied AI Robot Large Models: Isaac GR00T - VLA Large Model
    • 2.3.8 Embodied AI Robot Large Models: Dream Zero - World Action Model
    • 2.3.9 Embodied AI Robot Large Models: Implementation Status
    • 2.3.10 Embodied AI Robot Large Models: Ecosystem Partners
    • 2.3.11 Embodied AI Robot Large Models: Key Dynamics
  • 2.4 Google DeepMind
    • 2.4.1 Core Team for Embodied AI Robots: Google DeepMind
    • 2.4.2 Profile
    • 2.4.3 Development History
    • 2.4.4 Core Research Directions: 10 Major Fields
    • 2.4.5 Core Team Members and Their Resumes
    • 2.4.6 Summary of Large Models
    • 2.4.7 Major Large Model: Gemini
    • 2.4.8 Embodied AI Robot Large Models: Gemini Robotics
  • 2.5 OpenAI
    • 2.5.1 Profile
    • 2.5.2 Financing History: Valuation Increased More Than 25 Times in Three Years
    • 2.5.3 Development History
    • 2.5.4 Organizational Structure
    • 2.5.5 Product Matrix
    • 2.5.6 Industrial Layout and Planning for Embodied AI Robots
    • 2.5.7 Core Team Members and Resumes of the Humanoid Robot Lab
    • 2.5.8 Embodied AI Robot Large Models: Products Summary
    • 2.5.9 Embodied AI Robot Large Models: GPT-5 Embodied Adaptation Version
    • 2.5.10 Embodied AI Robot Large Models: VLA Foundation Model
    • 2.5.11 Embodied AI Robot Large Models: Ecosystem Partners
  • 2.6 Microsoft
    • 2.6.1 Industrial Layout History and Planning of Embodied AI Robots
    • 2.6.2 Team Setup for Embodied AI Robots
    • 2.6.3 Core Members and Their Resumes of the Embodied AI Team
    • 2.6.4 Summary of Self-Developed Large Model Products
    • 2.6.5 Embodied AI Robot Large Models: R&D History
    • 2.6.6 Embodied AI Robot Large Models: Rho-alpha - VLA+ Model
    • 2.6.7 Embodied AI Robot Large Models: Ecosystem Partners
    • 2.6.8 Embodied AI Robot Large Models: Recent Key News and Dynamics
  • 2.7 Huawei
    • 2.7.1 Industrial Layout and Planning for Embodied AI Robots
    • 2.7.2 Panoramic Table of Core Teams and Platforms for Embodied AI Robots
    • 2.7.3 Core Team Members and Resumes of the Embodied AI Special Task Group
    • 2.7.4 Overview of Pangu Large Model Products
    • 2.7.5 Pangu Large Model Capabilities: Multi-Modal Technology
    • 2.7.6 Pangu Large Model Capabilities: Reasoning Technology
    • 2.7.7 Pangu Large Model AI Cloud Services
    • 2.7.8 Embodied AI Robot Large Models: CloudRobo
    • 2.7.9 Embodied AI Robot Large Models: Ecosystem Partners
  • 2.8 Tencent RoboticsX
    • 2.8.1 Profile (1)-(2)
    • 2.8.2 Development History
    • 2.8.3 Embodied AI Robot Large Models: Products Summary
    • 2.8.4 Embodied AI Robot Large Models: Tairos-Perception
    • 2.8.5 Embodied AI Robot Large Models: Tairos-Planner
    • 2.8.6 Embodied AI Robot Large Models: Tairos-Action
    • 2.8.7 Embodied AI Robot Large Models: Ecosystem Partners
    • 2.8.8 Embodied AI Robot Large Models: Key Dynamics
  • 2.9 Baidu
    • 2.9.1 Industrial Layout History and Planning for Embodied AI Robots
    • 2.9.2 Introduction to Teams for Embodied AI Robots
    • 2.9.3 Summary of Large Model Products
    • 2.9.4 Embodied AI Robot Large Models: ERNIE Embodied Control Model
    • 2.9.5 Embodied AI Robot Large Models: Ecosystem Partners
    • 2.9.6 Embodied AI Robot Large Models: Key Dynamics
  • 2.10 ByteDance
    • 2.10.1 Industrial Layout History and Planning for Embodied AI Robots
    • 2.10.2 Introduction to Teams for Embodied AI Robots
    • 2.10.3 Core Team Members and Their Resumes of SeedRobotics for Embodied AI Robots
    • 2.10.4 Summary of Large Model Products
    • 2.10.5 Embodied AI Robot Large Models: Products Summary
    • 2.10.6 Embodied AI Robot Large Models: GR Series - Robot Cerebellum
    • 2.10.7 Embodied AI Robot Large Models: Robix - Robot Cerebrum
    • 2.10.8 Embodied AI Robot Large Models: M3-Agent - Multi-Modal Long-Term Memory
    • 2.10.9 Embodied AI Robot Large Models: Ecosystem Partners
  • 2.11 iFlytek
    • 2.11.1 Industrial Layout and Planning for Embodied AI Robots
    • 2.11.2 Related Teams/Enterprises for Embodied AI Robots
    • 2.11.3 Summary of Large Model Products
    • 2.11.4 Embodied AI Robot Large Models: Products Summary
    • 2.11.5 Embodied AI Robot Large Models: iFlyBot-VLM
    • 2.11.6 Embodied AI Robot Large Models: iFlyBot-VLA
    • 2.11.7 Embodied AI Robot Large Models: Ecosystem Partners
    • 2.11.8 Embodied AI Robot Large Models: Key Dynamics
  • 2.12 SenseTime
    • 2.12.1 Industrial Layout and Planning for Embodied AI Robots
    • 2.12.2 Related Teams/Enterprises for Embodied AI Robots
    • 2.12.3 Summary of Large Model Products
    • 2.12.4 Embodied AI Robot Large Models: Products Summary
    • 2.12.5 Embodied AI Robot Large Models: Wuneng Embodied AI Platform
    • 2.12.6 Embodied AI Robot Large Models: A1 - Embodied Super Brain
    • 2.12.7 Embodied AI Robot Large Models: Ecosystem Partners
    • 2.12.8 Embodied AI Robot Large Models: Key Dynamics

3 Global Major Players and Products: Robot Enterprise Camp

  • 3.1 Summary of Typical Embodied AI Large Model Products of Robot Enterprises (1)-(3)
  • 3.2 UBTECH Robotics (UBTECH)
    • 3.2.1 Profile
    • 3.2.2 Revenue
    • 3.2.3 Overview of Robot Products
    • 3.2.4 Core Technology System
    • 3.2.5 Development Strategy and Planning
    • 3.2.6 Embodied AI Robot Large Models: BrainNet Architecture
    • 3.2.7 Layout of Embodied AI Robot Large Models
    • 3.2.8 Embodied AI Robot Large Models: Core Information of Three Major Models
    • 3.2.9 Embodied AI Robot Large Models: Development History of Thinker Multi-Modal Large Model
    • 3.2.10 Embodied AI Robot Large Models: Thinker - Multi-Modal Large Model for Humanoid Robots
    • 3.2.11 Embodied AI Robot Large Models: Details of Large Model Installation on Humanoid Robots
    • 3.2.12 Embodied AI Robot Large Models: Ecosystem Partners
  • 3.3 Unitree Robotics
    • 3.3.1 Profile
    • 3.3.2 Market and Product Strategic Planning
    • 3.3.3 Embodied AI Robot Large Models: Development History
    • 3.3.4 Embodied AI Robot Large Models: Self-Developed UnifoLM Series
    • 3.3.5 Embodied AI Robot Large Models: UnifoLM-WMA-0
    • 3.3.6 Embodied AI Robot Large Models: UnifoLM-VLA-0
    • 3.3.7 Embodied AI Robot Large Models: Ecosystem Partners
    • 3.3.8 Embodied AI Robot Large Models: Details of Large Models Adapted to Robots
  • 3.4 AgiBot
    • 3.4.1 Profile
    • 3.4.2 Product Overview
    • 3.4.3 Embodied AI Robot Large Models: Five Self-Developed Core Models
    • 3.4.4 Embodied AI Robot Large Models: GO-1
    • 3.4.5 Embodied AI Robot Large Models: Guiguang Dongyu Large Model
    • 3.4.6 Embodied AI Robot Large Models: WorkGPT
    • 3.4.7 Embodied AI Robot Large Models: ActionGPT - Motion Large Model
    • 3.4.8 Embodied AI Robot Large Models: GE-1 - World Model
    • 3.4.9 Embodied AI Robot Large Models: Ecosystem Partners
    • 3.4.10 Embodied AI Robot Large Models: Details of Large Model Installation on Humanoid Robots
    • 3.4.11 "Project A"
  • 3.5 Leju Robotics
    • 3.5.1 Profile
    • 3.5.2 Development History
    • 3.5.3 Product Overview
    • 3.5.4 Development Strategy and Planning
    • 3.5.5 Embodied AI Robot Large Models: Development History
    • 3.5.6 Embodied AI Robot Large Models: Embodied AI Module (Self-Developed Multi-Modal by Leju) & Education Large Model
    • 3.5.7 Embodied AI Robot Large Models: Ecosystem Partners
    • 3.5.8 Embodied AI Robot Large Models: Details of Large Model Installation on Humanoid Robots
    • 3.5.9 Latest Dynamics
  • 3.6 Galbot
    • 3.6.1 Profile
    • 3.6.2 Core Team Members
    • 3.6.3 Product Overview
    • 3.6.4 Strategic Planning
    • 3.6.5 Embodied AI Robot Large Models: Summary of Self-Developed Large Models
    • 3.6.6 Embodied AI Robot Large Models: GraspVLA - Grasping Foundation Model
    • 3.6.7 Embodied AI Robot Large Models: Navigation Large Model
    • 3.6.8 Embodied AI Robot Large Models: GroceryVLA - Retail Scenario Large Model
    • 3.6.9 Embodied AI Robot Large Models: Ecosystem Partners
  • 3.7 RobotEra
    • 3.7.1 Profile
    • 3.7.2 Overview of Robot Products
    • 3.7.3 Four Stages of Embodied AI Robot Large Model Exploration
    • 3.7.4 Embodied AI Robot Large Models: ERA-42
    • 3.7.5 Embodied AI Robot Large Models: CtrlWorld - Controllable Generation World Model
    • 3.7.6 Embodied AI Robot Large Models: Joining Two Top Industry-University-Research Alliances Simultaneously
    • 3.7.7 Embodied AI Robot Large Models: Joint Open-Sourcing of AIGC Robot Large Model with Tsinghua University
    • 3.7.8 Embodied AI Robot Large Models: Ecosystem Partners
  • 3.8 FigureAI
    • 3.8.1 Profile
    • 3.8.2 Embodied AI Robot Large Models: Milestones in the Development
    • 3.8.3 Embodied AI Robot Large Models: Helix - End-to-End VLA General Embodied AI Model
    • 3.8.4 Embodied AI Robot Large Models: Details of Large Model Installation on Humanoid Robots
    • 3.8.5 Embodied AI Robot Large Models: Industry Chain Partners
  • 3.9 Sanctuary AI
    • 3.9.1 Profile
    • 3.9.2 Core Team Members and Their Resumes
    • 3.9.3 Embodied AI Robot Large Models: Development History
    • 3.9.4 Embodied AI Robot Large Models: Details of Large Model Installation on Humanoid Robots
    • 3.9.5 Embodied AI Robot Large Models: Summary of Self-developed Large Model
    • 3.9.6 Embodied AI Robot Large Models: Carbon(TM) v3
    • 3.9.7 Embodied AI Robot Large Models: LBM - Large Behavior Model
    • 3.9.8 Embodied AI Robot Large Models: Industry Chain Partners
  • 3.10 1X Technologies
    • 3.10.1 Profile
    • 3.10.2 Development History
    • 3.10.3 Core Team Members and Background
    • 3.10.4 Embodied AI Robot Large Models: R&D and Deployment History
    • 3.10.5 Embodied AI Robot Large Models: Summary of Self-Developed Large Models and Their Implementation Status
    • 3.10.6 Embodied AI Robot Large Models: Redwood AI
    • 3.10.7 Embodied AI Robot Large Models: 1X World Model
    • 3.10.8 Embodied AI Robot Large Models: Ecosystem Partners
  • 3.11 Neura Robotics
    • 3.11.1 Profile
    • 3.11.2 Embodied AI Robot Large Models: Development History
    • 3.11.3 Embodied AI Robot Large Models: Summary of Self-Developed Model System
    • 3.11.4 Embodied AI Robot Large Models: NEFM
    • 3.11.5 Embodied AI Robot Large Models: Ecosystem Partners
    • 3.11.6 Latest Dynamics: China Headquarters Settled in Xiaoshan, Hangzhou

4 Global Major Players and Products: Cross-Border OEMs Camp

  • 4.1 Summary of Typical Embodied AI Large Model Products of OEMs (1)-(4)
  • 4.2 Tesla
    • 4.2.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.2.2 Strategic Positioning in the Embodied AI Field
    • 4.2.3 Team Setup for Embodied AI Robots
    • 4.2.4 Core Team Members and Resumes of the Optimus Robot Team
    • 4.2.5 Embodied AI Robot Products and Large Model Deployment Status
    • 4.2.6 Embodied AI Robot Large Models: Summary of Large Model Products
    • 4.2.7 Embodied AI Robot Large Models: FSD - End-to-End Embodied Control Model
    • 4.2.8 Embodied AI Robot Large Models: Grok4 - Embodied Interaction Large Model
    • 4.2.9 Optimus Humanoid Robot Brain Adopting Dojo Supercomputer System
    • 4.2.10 Multiplexing FSD Software Algorithms for Robots
    • 4.2.11 AI Humanoid Robot Software Algorithms - Perception Algorithms
    • 4.2.12 AI Humanoid Robot Software Algorithms - Motion Planning
    • 4.2.13 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.2.14 Embodied AI Robot Large Models: Key Dynamics
  • 4.3 Toyota
    • 4.3.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.3.2 Related Teams/Companies for Embodied AI Robots
    • 4.3.3 Core Members and Their Resumes of the Research Institute
    • 4.3.4 Embodied AI Robot Products and Large Model Deployment Status
    • 4.3.5 Embodied AI Robot Large Models: Summary of Large Model
    • 4.3.6 Embodied AI Robot Large Models: LBM
    • 4.3.7 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.3.8 Embodied AI Robot Large Models: Key Dynamics
  • 4.4 Honda
    • 4.4.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.4.2 Related Teams/Companies for Embodied AI Robots
    • 4.4.3 Embodied AI Robot Products and Large Model Deployment Status
    • 4.4.4 Release of 2026 Core Technology Roadmap for Embodied Robots
    • 4.4.5 Overview of Embodied AI Robot Large Models
    • 4.4.6 Embodied AI Robot Large Models: Key Dynamics
  • 4.5 Hyundai
    • 4.5.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.5.2 AI Robotics Strategy: Partnering Human Progress
    • 4.5.3 Related Teams/Companies for Embodied AI Robots
    • 4.5.4 Holding a Controlling Stake in Boston Dynamics
    • 4.5.5 Boston Dynamics: Profile
    • 4.5.6 Boston Dynamics: Core Team Members and Resumes
    • 4.5.7 Embodied AI Robot Products and Large Model Deployment Status
    • 4.5.8 Embodied AI Robot Large Models: Overview of Large Model
    • 4.5.9 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.5.10 Embodied AI Robot Large Models: Key Dynamics
  • 4.6 Xiaomi
    • 4.6.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.6.2 Related Teams/Companies for Embodied AI Robots
    • 4.6.3 Panoramic View of Investment Ecosystem in the Embodied AI Robot Field
    • 4.6.4 Embodied AI Robot Products and Large Model Deployment Status
    • 4.6.5 Embodied AI Robot Large Models: Overview of Large Models
    • 4.6.6 Embodied AI Robot Large Models: Xiaomi-Robotics-0
    • 4.6.7 Embodied AI Robot: Self-Developed Software Algorithms
    • 4.6.8 Empowerment of Automotive Technology on Embodied AI Robots
    • 4.6.9 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.6.10 Embodied AI Robot Large Models: Key Dynamics
  • 4.7 XPeng
    • 4.7.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.7.2 Related Teams/Companies for Embodied AI Robots
    • 4.7.3 Core Talents and Their Resumes for Embodied AI Robots
    • 4.7.4 Product Iteration History and Large Model Deployment Status of Embodied AI Robots
    • 4.7.5 Embodied AI Robot Large Models: Summary of Large Models
    • 4.7.6 Embodied AI Robot Large Models: VLT - Robot-Specific Decision Large Model
    • 4.7.7 Embodied AI Robot Large Models: 2nd-Generation VLA Physical World Large Model
    • 4.7.8 Embodied AI Robot Large Models: VLM - Multi-Modal Interaction Large Model
    • 4.7.9 Multiplexing Automotive Algorithm Technology for Humanoid Robots
    • 4.7.10 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.7.11 Embodied AI Robot Large Models: Key Dynamics
  • 4.8 GAC Group
    • 4.8.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.8.2 Related Teams/Companies for Embodied AI Robots
    • 4.8.3 Establishment of Huilun Technology: Responsible for Core R&D, Production and Sales of Robots
    • 4.8.4 Huilun Technology: Core Members and Their Resumes
    • 4.8.5 Embodied AI Robot Products and Large Model Deployment Status
    • 4.8.6 Embodied AI Robot Large Models: Summary of Large Models
    • 4.8.7 Embodied AI Robot Large Models: GoMate - General Multi-Modal Large Model
    • 4.8.8 Embodied AI Robot Large Models: Embodied AI Motion Control Small Model
    • 4.8.9 Embodied AI Robot Large Models: GoMate Mini - Security Vertical Large Model
    • 4.8.10 Application of Autonomous Driving Technology on Humanoid Robots
    • 4.8.11 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.8.12 Embodied AI Robot Large Models: Summary of Key Dynamics
  • 4.9 Chery
    • 4.9.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.9.2 Related Teams/Companies for Embodied AI Robots
    • 4.9.3 Embodied AI Robot Products and Large Model Deployment Status
    • 4.9.4 Embodied AI Robot Large Models: Overview of Large Models
    • 4.9.5 Embodied AI Robot Large Models: Ecosystem Partners
  • 4.10 Leapmotor
    • 4.10.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.10.2 Related Teams/Companies for Embodied AI Robots
    • 4.10.3 Core Team Members and Their Resumes of Embodied AI Robot Team
    • 4.10.4 Embodied AI Robot Products and Large Model Deployment Status
    • 4.10.5 Embodied AI Robot Large Models: Summary and Planning of Large Models
    • 4.10.6 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.10.7 Embodied AI Robot Large Models: Key Dynamics
  • 4.11 BYD
    • 4.11.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.11.2 Related Teams/Companies for Embodied AI Robots
    • 4.11.3 Core Talents and Their Resumes for Embodied AI Robots
    • 4.11.4 Investment Ecosystem in the Embodied AI Robot Field
    • 4.11.5 Embodied AI Robot Products and Large Model Deployment Status
    • 4.11.6 Summary of Embodied AI Robot Large Models
    • 4.11.7 Embodied AI Robot Large Models: Key Dynamics
  • 4.12 Dongfeng Motor
    • 4.12.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.12.2 Related Teams/Companies for Embodied AI Robots
    • 4.12.3 Embodied AI Robot Products and Large Model Deployment Status
    • 4.12.4 Embodied AI Robot Large Models: Taiji Large Model
    • 4.12.5 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.12.6 Embodied AI Robot Large Models: Key Dynamics
  • 4.13 Summary of Global Other Main OEMs' Layout in the Embodied AI Robot Field (1)-(4)