TY - JOUR
T1 - A digital twin of intelligent robotic grasping based on single-loop-optimized differentiable architecture search and sim-real collaborative learning
AU - Jiao, Qing
AU - Hu, Weifei
AU - Hao, Guangbo
AU - Cheng, Jin
AU - Peng, Xiang
AU - Liu, Zhenyu
AU - Tan, Jianrong
N1 - Publisher Copyright:
© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2024.
PY - 2024
Y1 - 2024
N2 - The effectiveness of deep learning models for vision-based intelligent robotic grasping (IRG) tasks typically hinges on both the deep neural network (DNN) architecture and the availability of task-oriented annotated training samples. However, current approaches to designing DNN architectures rely on human expertise or on discrete search via evolutionary and reinforcement learning algorithms, which incurs enormous computational cost. Moreover, DNNs trained solely on simulation-labeled data are difficult to deploy directly in the real world. To address these concerns, this paper proposes a stable and fast differentiable architecture search (DARTS) method based on a single-loop optimization framework, named single-loop-optimized DARTS (SLO-DARTS). By continuously relaxing the discrete search space, this method updates the network weights and architecture parameters simultaneously. Additionally, a digital twin (DT) framework integrating the Grasp-CycleGAN method is developed to minimize the visual gap between simulated and real-world IRG scenarios, enhancing the transferability of DNNs trained in simulation. The DT framework not only improves IRG accuracy but also avoids the costly collection of large-scale real-world labeled data. Experiments demonstrate that SLO-DARTS requires less optimization time than the original dual-loop-optimized DARTS method while delivering a DNN with higher prediction accuracy. The developed DT framework achieves IRG accuracies of 92.6%, 86.3%, and 83.7% for single household objects, single adversarial objects, and cluttered objects, respectively.
KW - Architecture search
KW - Automatic design
KW - Collaborative learning
KW - Deep neural network
KW - Digital twin
KW - Robotic grasping
UR - https://www.scopus.com/pages/publications/85206844871
U2 - 10.1007/s10845-024-02498-w
DO - 10.1007/s10845-024-02498-w
M3 - Article
AN - SCOPUS:85206844871
SN - 0956-5515
JO - Journal of Intelligent Manufacturing
JF - Journal of Intelligent Manufacturing
ER -