Sensors (Basel). 2008 Aug 4;8(8):4505-4528.
doi: 10.3390/s8084505.

Object-Based Point Cloud Analysis of Full-Waveform Airborne Laser Scanning Data for Urban Vegetation Classification


Martin Rutzinger et al. Sensors (Basel). 2008.

Abstract

Airborne laser scanning (ALS) is a remote sensing technique well suited for 3D vegetation mapping and structure characterization because the emitted laser pulses are able to penetrate small gaps in the vegetation canopy. The backscattered echoes from the foliage, woody vegetation, the terrain, and other objects are detected, leading to a cloud of points. Higher echo densities (>20 echoes/m²) and additional classification variables from full-waveform (FWF) ALS data, namely echo amplitude, echo width, and information on multiple echoes from one shot, offer new possibilities in classifying the ALS point cloud. Currently, FWF sensor information is rarely used for classification purposes. This contribution presents an object-based point cloud analysis (OBPA) approach, combining segmentation and classification of the 3D FWF ALS points, designed to detect tall vegetation in urban environments. The definition of tall vegetation includes trees and shrubs but excludes grassland and herbage. In the applied procedure, FWF ALS echoes are segmented by a seeded region growing procedure. All echoes, sorted in descending order of their surface roughness, are used as seed points. Segments are grown based on echo width homogeneity. Next, segment statistics (mean, standard deviation, and coefficient of variation) are calculated by aggregating echo features such as amplitude and surface roughness. For classification, a rule base is derived automatically from a training area using a statistical classification tree. To demonstrate our method we present data from three sites with around 500,000 echoes each. The accuracy of the classified vegetation segments is evaluated for two independent validation sites. In a point-wise error assessment, where the classification is compared with manually classified 3D points, completeness and correctness better than 90% are reached for the validation sites.
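The seeded region growing step can be sketched as follows. This is a minimal illustration on synthetic 2D "echoes" (the paper works on 3D FWF ALS points); the radius and echo-width tolerance values, and the brute-force neighbour scan, are assumptions for the sketch, not the authors' implementation.

```python
# Sketch of seeded region growing on echo-width homogeneity.
# Seeds are processed in descending order of surface roughness;
# a neighbour joins a segment if its echo width lies within
# `width_tol` of the segment's running mean width.
import math

def region_grow(points, roughness, echo_width, radius=1.5, width_tol=0.5):
    n = len(points)
    segment = [-1] * n  # segment id per echo, -1 = unassigned
    # every echo is a seed candidate, sorted by roughness (descending)
    order = sorted(range(n), key=lambda i: roughness[i], reverse=True)
    next_id = 0
    for seed in order:
        if segment[seed] != -1:
            continue
        segment[seed] = next_id
        members = [seed]
        mean_w = echo_width[seed]
        queue = [seed]
        while queue:
            cur = queue.pop()
            for j in range(n):  # brute-force neighbour search (illustrative)
                if segment[j] != -1:
                    continue
                if (math.dist(points[cur], points[j]) <= radius
                        and abs(echo_width[j] - mean_w) <= width_tol):
                    segment[j] = next_id
                    members.append(j)
                    # update the segment's running mean echo width
                    mean_w = sum(echo_width[k] for k in members) / len(members)
                    queue.append(j)
        next_id += 1
    return segment
```

For point counts on the order of 500,000 echoes per site, the O(n²) neighbour scan would be replaced by a spatial index (e.g. a k-d tree); per-segment statistics (mean, standard deviation, coefficient of variation of amplitude and roughness) can then be aggregated over the returned segment ids and fed to the classification tree.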
In contrast to many other algorithms, the proposed 3D point classification works directly on the original measurements, i.e. the acquired points. Gridding of the data, a process inherently coupled with loss of data and precision, is not necessary. The 3D properties provide particularly good separability of building and terrain points when these are occluded by vegetation.
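The point-wise error assessment above reduces to two standard measures of a binary classification against a manual reference: completeness (the fraction of reference vegetation points that were detected) and correctness (the fraction of predicted vegetation points that are truly vegetation). A minimal sketch, with illustrative label values:

```python
# Point-wise completeness and correctness of a binary classification.
# `reference` and `predicted` are per-point label sequences; the label
# value "veg" is an illustrative assumption.
def completeness_correctness(reference, predicted, positive="veg"):
    tp = sum(1 for r, p in zip(reference, predicted) if r == positive and p == positive)
    fn = sum(1 for r, p in zip(reference, predicted) if r == positive and p != positive)
    fp = sum(1 for r, p in zip(reference, predicted) if r != positive and p == positive)
    completeness = tp / (tp + fn) if tp + fn else 0.0  # reference points found
    correctness = tp / (tp + fp) if tp + fp else 0.0   # predictions that are correct
    return completeness, correctness
```

Values above 0.9 for both measures, as reported for the validation sites, mean fewer than 10% of reference vegetation points are missed and fewer than 10% of detected points are false alarms.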

Keywords: 3D feature calculation; Airborne laser scanning; Classification; Error assessment; Full-waveform; Object-based point cloud analysis; Segmentation; Urban vegetation.


Figures

Figure 1. Study area (left) and nadir flight paths (right).

Figure 2. Workflow of object-based point cloud analysis for vegetation detection (a = amplitude, w = echo width, n = echo number, c = echo count, id = segment id, stats = segment statistics).

Figure 3. Derived classification trees for the training site Rathauspark using either echo width or amplitude as additional feature. The error diagrams show the cross-validated error plotted against classification tree size for varying complexity parameter settings.

Figure 4. Sample profiles showing the reference, CTew cp=0.01, CTew cp=0.004, and CTampl colored by end nodes (= branches) for the test site Rathauspark.

Figure 5. Sample profiles showing the reference, CTew cp=0.01, CTew cp=0.004, and CTampl colored by end nodes (= branches) for the validation site Burggarten.

Figure 6. Comparison of classified vegetation points in the reference, CTew cp=0.01, CTew cp=0.004, and CTampl for the three sites Rathauspark (a-d), Burggarten (e-h), and Volksgarten (i-l). In (a), areas with parked cars are labeled as 1 and 2; 3 and 4 are examples of buildings connected to or covered by trees. In (e), areas with park fences and walls are labeled as 1 and 2. In (i), areas with short-cut vegetation are labeled as 1, the fountain with grass islands as 2, and the ventilation shaft as 3.

Figure 7. Validation sites Burggarten and Volksgarten.

