BRMPNet: bidirectional recurrent motion planning networks for generic robotic platforms in smart manufacturing
Bo-Han Feng, Bo-Yan Li, Xin-Ting Jiang, Qi Zhou, You-Yi Bi
Advances in Manufacturing, 2025, Vol. 13, Issue 3: 477–492.
In the era of Industry 4.0, robot motion planning faces unprecedented challenges in adapting to high-dimensional, dynamic working environments with rigorous real-time planning requirements. Traditional sampling-based planning algorithms can find solutions in high-dimensional spaces but often struggle to balance computational efficiency, real-time adaptability, and solution optimality. To overcome these challenges and unlock the full potential of robotic automation in smart manufacturing, we propose the bidirectional recurrent motion planning network (BRMPNet). As an imitation learning-based approach to robot motion planning, it leverages deep neural networks to learn heuristics for near-optimal path planning. BRMPNet employs a refined PointNet++ network to incorporate raw point-cloud information from depth sensors and generates paths with a bidirectional strategy using a long short-term memory (LSTM) network. It can also be integrated with traditional sampling-based planning algorithms, offering a theoretical guarantee of probabilistic completeness for its solutions. To validate the effectiveness of BRMPNet, we conduct a series of experiments benchmarking its performance against state-of-the-art motion planning algorithms. These experiments are designed to simulate common operations of generic robotic platforms in smart manufacturing, such as mobile robots and multi-joint robotic arms. The results demonstrate BRMPNet’s superior performance on key metrics, including solution quality and computational efficiency, suggesting the promising potential of learning-based planning in addressing complex motion planning challenges.
Robot motion planning / Imitation learning / Deep neural network / Smart manufacturing / Adaptive and real-time planning
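The abstract describes the architecture only at a high level; the minimal sketch below illustrates how a point-cloud obstacle encoder and a bidirectional recurrent waypoint generator of this kind could fit together. It is an assumption-laden illustration, not the authors' implementation: the PointNet-style encoder stands in for the refined PointNet++ module, the class names, dimensions, and seven-degree-of-freedom configuration are invented for the example, and the bidirectional strategy is approximated by alternately growing the path from the start and goal ends with a shared LSTM.

```python
# Hypothetical sketch of a BRMPNet-style planner (not the authors' code):
# a simplified point-cloud encoder feeding an LSTM that proposes the next
# waypoint from both the start and the goal ends of the path.
import torch
import torch.nn as nn


class PointCloudEncoder(nn.Module):
    """Per-point MLP + max pooling: a PointNet-style stand-in for PointNet++."""

    def __init__(self, out_dim: int = 256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, out_dim),
        )

    def forward(self, points: torch.Tensor) -> torch.Tensor:
        # points: (batch, num_points, 3) -> global obstacle feature (batch, out_dim)
        return self.mlp(points).max(dim=1).values


class WaypointPredictor(nn.Module):
    """LSTM that consumes the partial path (grown from either end), the target
    end point, and the obstacle feature, and regresses the next waypoint."""

    def __init__(self, dof: int = 7, feat_dim: int = 256, hidden: int = 256):
        super().__init__()
        self.lstm = nn.LSTM(input_size=dof + dof + feat_dim,
                            hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, dof)

    def forward(self, partial_path, target, obstacle_feat):
        # partial_path: (batch, steps, dof); target: (batch, dof)
        steps = partial_path.size(1)
        ctx = torch.cat([target, obstacle_feat], dim=-1)   # (batch, dof + feat_dim)
        ctx = ctx.unsqueeze(1).expand(-1, steps, -1)       # repeat along time axis
        out, _ = self.lstm(torch.cat([partial_path, ctx], dim=-1))
        return self.head(out[:, -1])                       # next waypoint (batch, dof)


if __name__ == "__main__":
    enc, planner = PointCloudEncoder(), WaypointPredictor()
    cloud = torch.randn(1, 2048, 3)                 # raw depth-sensor point cloud
    start, goal = torch.zeros(1, 7), torch.ones(1, 7)
    feat = enc(cloud)
    # Bidirectional expansion: alternately grow the path from start and from goal.
    fwd, bwd = [start], [goal]
    for _ in range(5):
        fwd.append(planner(torch.stack(fwd, dim=1), goal, feat))
        bwd.append(planner(torch.stack(bwd, dim=1), start, feat))
    print(len(fwd), len(bwd))
```

In the hybrid setting mentioned in the abstract, waypoints from such a network could serve as informed seeds for a traditional sampling-based planner, which is the component that would supply the probabilistic-completeness guarantee.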