Prox Industries Co., Ltd. and ugo Inc. are pushing further into Physical AI research through a joint development effort focused on autonomous robot operations using imitation learning and Vision-Language-Action models.
The companies demonstrated autonomous object manipulation by fine-tuning the open-source VLA model ‘π0.5’ using ugo’s ‘Imitation Learning Kit for AI Robots’ and ugo Pro, a domestically developed humanoid robot. The project combined motion data collected from real robots and from simulation environments to test how robots could adapt their actions to surrounding conditions.
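The approach described above, pooling real-robot and simulated demonstrations and then fine-tuning a policy on the combined set, can be illustrated with a minimal behavior-cloning sketch. Everything here is a hypothetical simplification: the linear policy, data generators, and noise levels stand in for the actual setup, which fine-tunes the π0.5 VLA model on teleoperated demonstration data rather than fitting a linear map.

```python
import numpy as np

rng = np.random.default_rng(0)
OBS_DIM, ACT_DIM = 8, 4
W_EXPERT = rng.normal(size=(OBS_DIM, ACT_DIM))  # stand-in for the expert policy

def collect_demos(n, noise):
    """Record hypothetical (observation, action) pairs from the expert.
    `noise` models sensor/actuation error and is higher for sim rollouts."""
    obs = rng.normal(size=(n, OBS_DIM))
    act = obs @ W_EXPERT + noise * rng.normal(size=(n, ACT_DIM))
    return obs, act

# Pool a small set of clean real-robot demos with a larger, noisier sim set.
real_obs, real_act = collect_demos(200, noise=0.01)
sim_obs, sim_act = collect_demos(2000, noise=0.05)
obs = np.vstack([real_obs, sim_obs])
act = np.vstack([real_act, sim_act])

# Behavior cloning: fit the policy by least squares on the pooled dataset.
W_policy, *_ = np.linalg.lstsq(obs, act, rcond=None)

# Evaluate how well the cloned policy reproduces expert actions.
test_obs, test_act = collect_demos(500, noise=0.0)
mse = float(np.mean((test_obs @ W_policy - test_act) ** 2))
print(f"held-out action MSE: {mse:.5f}")
```

The point of the sketch is the data-mixing step: abundant simulation data broadens coverage while the smaller real-robot set anchors the policy to real dynamics, which is the trade-off the joint project is exploring at far larger scale.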
The demonstration featured a dual-arm robot performing an entire packaging task autonomously: it opened a paper bag with one arm while using the other to place a stuffed animal inside.
The demonstration was showcased at the 2026 Humanoid Robot EXPO in Tokyo, where ugo Pro drew attention as one of the few domestically produced humanoid robots among a large number of overseas exhibitors.
The bigger focus here is adaptability. Physical AI companies are now trying to move robots beyond repetitive fixed automation into environments where machines can respond to real-world variations in objects, positioning, and workflow conditions. Prox Industries says future development will focus on industrial deployment by combining robotics, AI models, and operational data into real-world systems.


