Articles

Vol. 6 No. 1 (2019)

Directional PointNet: 3D Environmental Classification for Wearable Robots

  • Kuangen ZHANG
  • Jing WANG
  • Chenglong FU
Submitted: January 26, 2024
Published: March 15, 2019

Abstract

A subject wearing a suitable robotic device can walk in complex environments with the aid of environmental recognition schemes that provide reliable prior information about human motion intent. Researchers have utilized 1D laser signals and 2D depth images to classify environments, but these approaches can suffer from self-occlusion. In comparison, a 3D point cloud is more appropriate for depicting the environment. This paper proposes a directional PointNet to classify the 3D point cloud directly. First, an inertial measurement unit (IMU) is used to offset the orientation of the point cloud. Then the directional PointNet accurately classifies the terrains commonly encountered in daily life, including level ground, climbing up stairs, and walking down stairs. A classification accuracy of 98% was achieved in tests. Moreover, the directional PointNet is more efficient than the previously used PointNet because the T-net, which is utilized to estimate the transformation of the point cloud, is not needed in the present approach, and the length of the global feature is optimized. The experimental results demonstrate that the directional PointNet can classify the environments robustly and efficiently.
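To make the described pipeline concrete, below is a minimal, hypothetical sketch (not the authors' released code) of the two steps outlined in the abstract: rotating each point cloud into a gravity-aligned frame with an IMU-derived rotation, then classifying it with a PointNet-style network that omits the T-net. It assumes PyTorch; the names `align_with_imu` and `DirectionalPointNet`, the 256-dimensional global feature, and the layer sizes are illustrative placeholders rather than the values optimized in the paper.

```python
# Hypothetical sketch of a directional PointNet-style classifier (illustrative only).
# Assumptions: PyTorch, point clouds of shape (B, N, 3), three terrain classes.
import torch
import torch.nn as nn


def align_with_imu(points: torch.Tensor, rotation: torch.Tensor) -> torch.Tensor:
    """Rotate a batch of point clouds into a gravity-aligned frame.

    points:   (B, N, 3) raw point clouds from the depth camera.
    rotation: (B, 3, 3) rotation matrices derived from the IMU orientation.
    """
    # Apply R to every point: p' = R @ p, done as a batched matrix multiply.
    return torch.bmm(points, rotation.transpose(1, 2))


class DirectionalPointNet(nn.Module):
    """Simplified PointNet classifier without the T-net alignment modules."""

    def __init__(self, num_classes: int = 3, global_feat: int = 256):
        super().__init__()
        # Shared per-point MLP implemented with 1x1 convolutions.
        self.point_mlp = nn.Sequential(
            nn.Conv1d(3, 64, 1), nn.BatchNorm1d(64), nn.ReLU(),
            nn.Conv1d(64, 128, 1), nn.BatchNorm1d(128), nn.ReLU(),
            nn.Conv1d(128, global_feat, 1), nn.BatchNorm1d(global_feat), nn.ReLU(),
        )
        # Classification head on the max-pooled global feature.
        self.classifier = nn.Sequential(
            nn.Linear(global_feat, 128), nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(128, num_classes),
        )

    def forward(self, points: torch.Tensor) -> torch.Tensor:
        # points: (B, N, 3) -> (B, 3, N) for Conv1d.
        x = self.point_mlp(points.transpose(1, 2))
        # Symmetric max pooling yields an order-invariant global feature.
        x = torch.max(x, dim=2).values
        return self.classifier(x)


if __name__ == "__main__":
    batch = torch.randn(4, 1024, 3)        # four clouds of 1024 points each
    rot = torch.eye(3).repeat(4, 1, 1)     # identity rotation stands in for IMU output
    logits = DirectionalPointNet()(align_with_imu(batch, rot))
    print(logits.shape)                    # torch.Size([4, 3])
```

Because the IMU pre-aligns the cloud, the learned spatial transformer (T-net) of the original PointNet becomes unnecessary, which is what reduces the model size and inference cost in this sketch as well.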

