Estimating the stand level vegetation structure map using drone optical imageries and LiDAR data based on an Artificial Neural Networks (ANNs)

Sungeun Cha, Hyun Woo Jo, Chul Hee Lim, Cholho Song, Sle Gee Lee, Jiwon Kim, Chiyoung Park, Seong Woo Jeon, Woo Kyun Lee

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

Understanding vegetation structure is important for managing forest resources and for sustainable forest development. With recent technological advances, new tools such as drones and deep learning can be applied to forests and used to estimate vegetation structure. In this study, the vegetation structure of the Gongju, Samchuk, and Seoguipo areas was identified by fusing drone optical images and LiDAR data with Artificial Neural Networks (ANNs), achieving accuracies of 92.62% (Kappa: 0.59), 91.57% (Kappa: 0.53), and 86.00% (Kappa: 0.63), respectively. The performance of this deep-learning-based vegetation structure analysis is expected to improve as the amount of information in the optical imagery and LiDAR data increases. In the future, if the model is developed with sufficient sampling and enough complexity to reflect the diverse characteristics of vegetation, it could provide reference data for Korea's forest policies and regulations through the construction of a country-level vegetation structure map.
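The abstract reports overall accuracy together with a Kappa statistic for an ANN applied to fused drone-optical and LiDAR inputs. As a rough illustration only, not the authors' implementation, the sketch below shows feature-level fusion of optical bands with LiDAR-derived metrics, a small multilayer perceptron classifier, and evaluation with overall accuracy and Cohen's kappa. All feature names, shapes, labels, and hyperparameters are assumptions; the paper's actual network design, inputs, and sampling are not specified in this abstract.

# Minimal sketch (assumed setup, not the paper's method): fuse per-pixel
# drone-optical bands with LiDAR-derived height metrics, classify vegetation
# structure with a small feed-forward ANN, and report accuracy and kappa.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score

rng = np.random.default_rng(0)

# Hypothetical stacked features: 3 optical bands + 2 LiDAR metrics
# (e.g., canopy height, return density), one row per sampled pixel/cell.
n_samples = 5000
optical = rng.random((n_samples, 3))      # drone-optical reflectance bands
lidar = rng.random((n_samples, 2))        # LiDAR-derived structural metrics
X = np.hstack([optical, lidar])           # simple feature-level fusion
y = rng.integers(0, 3, n_samples)         # dummy vegetation structure classes

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_train)

# Small multilayer perceptron standing in for the ANN described in the abstract.
ann = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
ann.fit(scaler.transform(X_train), y_train)

pred = ann.predict(scaler.transform(X_test))
print("Overall accuracy:", accuracy_score(y_test, pred))
print("Kappa:", cohen_kappa_score(y_test, pred))

With real data, the random arrays would be replaced by co-registered raster stacks of drone orthomosaic bands and LiDAR-derived layers, with labels taken from field-surveyed vegetation structure classes.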

Original language: English
Pages (from-to): 653-666
Number of pages: 14
Journal: Korean Journal of Remote Sensing
Volume: 36
Issue number: 5-11
DOIs
Publication status: Published - 2020

Bibliographical note

Publisher Copyright:
© 2020 Korean Society of Remote Sensing. All rights reserved.

Keywords

  • Artificial Neural Networks (ANNs)
  • Drone image
  • Sustainable forest development
  • Vegetation structure

ASJC Scopus subject areas

  • Computers in Earth Sciences
  • Engineering (miscellaneous)
  • Earth and Planetary Sciences (miscellaneous)
