Inferring surface geometry from point clouds for next best view planning
- Publication Date
- 7 June 2017
- Abstract
The next best view planning approach presented in this work is designed for mapping large-scale environments in real time using a system that is model-free and reconstruction-free and maintains a continuous spatial representation. Observations from a 3D sensor are combined into a point cloud representation of the environment. Density is computed locally over the point cloud to define frontiers, and the surface geometry of the observed scene is estimated at these frontiers. Viewpoint proposals are obtained for each frontier based on the estimated surface geometry. The system terminates when the scene has been completely observed (i.e., there are no more frontiers).
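The frontier-and-viewpoint pipeline described in the abstract could be sketched roughly as follows. This is an illustrative toy, not the authors' implementation: it uses brute-force neighbour counts for local density, PCA over a point's neighbourhood for the surface-normal estimate, and arbitrary radius, threshold, and standoff values.

```python
# Toy sketch of a density-based frontier detector and viewpoint proposer
# for a point cloud. All parameters (radius, min_neighbours, standoff)
# are illustrative assumptions, not values from the paper.
import numpy as np

def local_density(points, radius):
    # Count neighbours within `radius` of each point (brute force, O(n^2)).
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    return (d < radius).sum(axis=1) - 1  # exclude the point itself

def frontiers(points, radius, min_neighbours):
    # Points with few neighbours sit on the boundary of the observed region,
    # so low local density marks them as frontiers.
    return np.flatnonzero(local_density(points, radius) < min_neighbours)

def propose_viewpoint(points, idx, radius, standoff):
    # Estimate the local surface normal at a frontier point via PCA of its
    # neighbourhood (smallest-eigenvalue direction of the covariance), then
    # place a candidate viewpoint `standoff` away along that normal.
    p = points[idx]
    nbrs = points[np.linalg.norm(points - p, axis=1) < radius]
    eigval, eigvec = np.linalg.eigh(np.cov(nbrs.T))
    normal = eigvec[:, 0]  # direction of least spread = surface normal
    return p + standoff * normal

if __name__ == "__main__":
    # A dense planar patch: interior points are dense, boundary points sparse.
    xs, ys = np.meshgrid(np.linspace(0, 1, 10), np.linspace(0, 1, 10))
    cloud = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(100)])
    f = frontiers(cloud, radius=0.2, min_neighbours=6)
    print(len(f))  # the 36 boundary points of the 10x10 grid
    print(propose_viewpoint(cloud, f[0], radius=0.2, standoff=0.5))
```

On this planar example the estimated normal is the z-axis (up to sign), so the proposed viewpoint sits 0.5 units above or below the frontier point, facing the unobserved boundary.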
- Publication Details
- Type
- Abstract-Refereed Conference Paper
- Conference
- Joint Industry and Robotics Centres for Doctoral Training (CDTs) Symposium (JIRCS)
- Location
- Oxford, UK
- Manuscript
- Google Scholar
- BibTeX Entry
@inproceedings{border_jircs17,
author = {Rowan Border and Jonathan D Gammell and Paul Newman},
title = {Inferring surface geometry from point clouds for next best view planning},
booktitle = {Joint Industry and Robotics Centres for Doctoral Training ({CDTs}) Symposium ({JIRCS})},
year = {2017},
address = {Oxford, UK},
month = {7 } # jun,
}