Once characterised by static, single-arm machines performing repetitive tasks, robotics is undergoing a profound transformation.
Today’s systems are evolving into embodied artificial intelligence (AI), with machines that can perceive, learn and interact with the physical world in far more autonomous and sophisticated ways. Examples range from autonomous delivery vehicles navigating city streets to dexterous humanoid robots handling complex logistics tasks such as truck loading and palletising.
The appeal of embodied AI lies in such robots’ ability to make decisions and operate independently in dynamic environments, rather than merely executing pre-programmed instructions.
“Classically, robots required the environment to be heavily [defined] or engineered to eliminate as much uncertainty [as possible]. Embodied AI takes a different view as the robots interact with the physical world in a more open or less structured environment,” says Professor Stephen Smith, president of the Association for the Advancement of AI, in an interview with The Edge Singapore at ATxSummit 2025. He contrasts a traditional factory robot, which is precisely aware of the item it handles and its position on a conveyor belt, with an autonomous vehicle (a form of embodied AI) that must constantly interpret and respond to unpredictable elements such as shifting traffic patterns and unplanned road obstacles.
Investors are taking note. According to Accenture, global funding for humanoid start-ups surged to US$1.1 billion ($1.4 billion) in 2024, up sharply from US$360 million the previous year. Market forecasts are equally bullish, with Dimension Market Research projecting the global embodied AI market to grow at a compound annual rate of about 15%, reaching US$8.7 billion by 2033. This momentum is driven by demand across sectors such as manufacturing, healthcare, logistics, and automotive — industries where intelligent autonomous machines can shoulder labour-intensive or hazardous tasks.
Case in point: California-based Dexterity raised US$95 million in March, bringing its valuation to US$1.65 billion. The start-up’s industrial robots are built to mimic human dexterity, enabling them to perform repetitive and potentially dangerous warehouse tasks like loading and unloading trucks or sorting and moving parcels. Its clients already include logistics giants Federal Express and United Parcel Service.
The space industry could also benefit from embodied AI. Smith notes that in deep space missions (where long communication delays make ground control impractical and human crews may be absent), robotic autonomy becomes critical. “A team of robotic arms can be mounted on rail systems within spacecraft modules, capable of moving around, coordinating, and avoiding collisions while performing essential functions like maintenance and repair tasks [without human intervention],” he says.
Building common sense into robots
Scaling embodied AI remains challenging. Smith highlights that a primary hurdle lies in instilling human-like common sense into physical robots. Unlike digital AI such as large language models (LLMs), which are trained on passive data and rely on statistical correlations, embodied AI systems must learn through direct interaction with the physical world, much as children develop understanding through trial, error and exploration. These systems must experiment, fail and recover, gradually building causal models of how the world works.
A further challenge lies in linking physically grounded, behavioural intelligence (such as balance and immediate reaction to stimuli, as seen in Boston Dynamics robots) with higher-level cognitive intelligence. Without this integration, robots may move like humans but still lack the judgment to act like them.
“To build a common sense model, an embodied AI robot needs to interact with the real world and keep actively learning through experimentation to allow its capabilities to evolve over time, just as children learn and grow. It’s a continual process, not one where you stop the robot to inject new knowledge for it to relearn,” says Smith.
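The continual, experiment-and-update learning Smith describes is the core idea behind reinforcement-style learning: the robot tries an action, observes the outcome, and refines its estimates incrementally rather than being stopped and retrained. A toy sketch illustrates this; the two grasp strategies and their success probabilities are invented for illustration, not drawn from any real system:

```python
import random

random.seed(0)

# Hypothetical example: a robot maintains running estimates of how well two
# grasp strategies work, and refines them after every attempt -- learning is
# continual, not a stop-and-retrain cycle.
TRUE_SUCCESS = {"pinch": 0.4, "power": 0.8}   # unknown to the robot
estimates = {"pinch": 0.5, "power": 0.5}      # initial guesses
counts = {"pinch": 0, "power": 0}

def attempt(strategy: str) -> bool:
    """Simulate one grasp attempt in the physical world."""
    return random.random() < TRUE_SUCCESS[strategy]

for trial in range(500):
    # Mostly exploit the current best estimate, but keep exploring
    # (an epsilon-greedy policy with epsilon = 0.1).
    if random.random() < 0.1:
        strategy = random.choice(["pinch", "power"])
    else:
        strategy = max(estimates, key=estimates.get)
    success = attempt(strategy)
    counts[strategy] += 1
    # Incremental running-average update: no retraining from scratch.
    estimates[strategy] += (success - estimates[strategy]) / counts[strategy]

print(max(estimates, key=estimates.get))  # the strategy the robot has learned to prefer
```

The incremental update on the last line of the loop is what makes the process continual: each real-world attempt nudges the robot’s causal model of “what works” without discarding what it has already learned.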
Embodied AI also demands a far higher safety threshold than digital AI. “When we look at digital AI, like chatbots, the core requirement is usefulness, so we can tolerate their minor errors or odd phrasing. But that’s not enough for physical AI. Embodied AI robots can’t afford mistakes that could harm humans,” says Dexterity’s CEO, Samir Menon, at a panel discussion at ATxSummit 2025.
He explains that the development of embodied AI can be broadly broken down into three components. First is perception, which requires robots to see, touch, hear, and understand their surroundings. Second is reasoning, where they need to assess how to interact with various objects, anticipate reactions, and operate safely alongside humans. Finally, they must be able to move, the step that introduces the most risk and complexity, as errors can cause physical harm to humans.
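The perception–reasoning–movement breakdown Menon describes maps onto the classic sense–plan–act control loop in robotics. A minimal sketch of one tick of such a loop; every class, function and threshold here is hypothetical, chosen only to show how the three stages connect and why safety checks sit in the reasoning stage:

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """What the robot perceives: an object's offset and whether a human is nearby."""
    object_x: float
    human_nearby: bool

def perceive(raw_sensor_value: float, proximity_flag: bool) -> Observation:
    # Perception: turn raw sensor readings into a structured view of the world.
    return Observation(object_x=raw_sensor_value, human_nearby=proximity_flag)

def reason(obs: Observation) -> str:
    # Reasoning: choose an action, with safety taking priority over task progress.
    if obs.human_nearby:
        return "stop"            # never move while a person is in the workspace
    return "grasp" if abs(obs.object_x) < 0.1 else "approach"

def act(command: str) -> str:
    # Movement: the riskiest stage -- a real controller would add force limits,
    # watchdog timers and an emergency stop here.
    return f"executing {command}"

# One tick of the loop: a human is detected, so the robot halts.
obs = perceive(raw_sensor_value=0.05, proximity_flag=True)
print(act(reason(obs)))  # -> executing stop
```

The point of the ordering is Menon’s: errors in perception or reasoning are recoverable in software, but once a command reaches the movement stage it has physical consequences, so the safety decision must be made before any motion command is issued.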
Yet, training robots to navigate these demands safely is still an open challenge. The datasets required are incomplete or non-existent. While simulations can help accelerate early development by enabling “inspirational demonstrations” (or controlled scenarios that showcase potential), transferring those lessons into real-world embodied AI with reliable behaviour remains complex, notes Menon. That is why the current deployment of physical AI remains largely confined to structured industrial and logistics environments, where tasks and conditions can be tightly controlled.
Other considerations
Clear standards will be essential for embodied AI’s widespread adoption. “If I train an AI [software] on a robot with one type of hand, and then change the hand or switch to two arms, will it still work? Right now, we just don’t know. So, there must be standards ensuring AI trained on one robot type can transfer effectively to others,” says Menon.
Meanwhile, Smith underscores the importance of ethics as an evaluation metric. “We’re currently focused on performance or how well the embodied AI robots are doing the tasks, but we should consider other dimensions too, like ethics, because they operate within human environments, which have a large component of social norms.”
On the adopters’ side, the biggest challenge may be extending responsible AI practices from the digital realm into the physical world, according to Accenture’s Technology Vision 2025 report. While many companies have established frameworks for responsible AI governance, robotics introduces new and complex dimensions that existing policies do not fully address.
Take data privacy, for example. Existing laws around consent and the “right to be forgotten” are well established, but what happens when robots continuously collect audio-visual data to navigate public spaces? How do they comply with privacy laws on recording without consent or public filming? Storing data locally limits learning capabilities, but broader data collection risks exposing companies to legal liabilities.
Physical safety adds another layer of complexity. Beyond risks of hacking or malfunctions causing harm, questions arise about how robots should behave when operating as intended. For instance, a security robot confronting a breach or an emergency response robot on a construction site may face scenarios where harm is unavoidable. In these moments, the machines will be forced to make decisions that balance minimising damage and protecting human lives.
As such, governance frameworks that guide these decisions will be critical for enterprises deploying embodied AI. “Leaning on the established thinking and principles of responsible AI will be a good start, but leaders should be planning now for the unique situations their environment poses,” the Accenture report notes. It also stresses that responsible practices are essential to building human trust in working alongside embodied AI robots or machines.