Classification of land cover and land use based on convolutional neural networks

authored by
Chun Yang, Franz Rottensteiner, Christian Heipke
Abstract

Land cover describes the physical material of the earth's surface, whereas land use describes the socio-economic function of a piece of land. Land use information is typically collected in geospatial databases. As such databases become outdated quickly, an automatic update process is required. This paper presents a new approach to determine land cover and to classify land use objects based on convolutional neural networks (CNN). The input data are aerial images and derived data such as digital surface models. Firstly, we apply a CNN to determine the land cover for each pixel of the input image. We compare different CNN structures, all of them based on an encoder-decoder structure for obtaining dense class predictions. Secondly, we propose a new CNN-based methodology for the prediction of the land use label of objects from a geospatial database. In this context, we present a strategy for generating image patches of identical size from the input data, which are classified by a CNN. Again, we compare different CNN architectures. Our experiments show that overall accuracies of up to 85.7 % and 77.4 % can be achieved for land cover and land use, respectively. The classification of land cover makes a positive contribution to the classification of land use.
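The following is a minimal, hypothetical sketch of the two-step idea summarised in the abstract: an encoder-decoder CNN that yields a dense (per-pixel) land cover prediction, followed by a second CNN that classifies fixed-size image patches of land use objects. It is written in PyTorch; the framework, layer sizes, channel counts, class counts, and the idea of appending the land cover result as an extra input channel are illustrative assumptions, not the authors' exact design.

import torch
import torch.nn as nn


class EncoderDecoderCNN(nn.Module):
    """Pixel-wise land cover classification: the encoder downsamples the input,
    the decoder upsamples back to the input resolution so every pixel gets a class score."""

    def __init__(self, in_channels=4, num_land_cover_classes=8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                                     # 1/2 resolution
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                                     # 1/4 resolution
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, 32, kernel_size=2, stride=2), nn.ReLU(inplace=True),
            nn.Conv2d(32, num_land_cover_classes, kernel_size=1),  # per-pixel class scores
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))


class LandUsePatchCNN(nn.Module):
    """Land use classification of database objects: fixed-size image patches
    (e.g. image bands plus the land cover prediction as an extra channel)
    are mapped to a single land use label per patch."""

    def __init__(self, in_channels=5, num_land_use_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(4),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),                             # pool to 1x1 for any patch size
        )
        self.classifier = nn.Linear(64, num_land_use_classes)

    def forward(self, patches):
        return self.classifier(self.features(patches).flatten(1))


if __name__ == "__main__":
    # Aerial image (e.g. RGB + DSM as a fourth band) -> per-pixel land cover scores.
    image = torch.randn(1, 4, 256, 256)
    land_cover_logits = EncoderDecoderCNN()(image)               # shape: (1, 8, 256, 256)
    # Fixed-size patch with the land cover result appended as a fifth channel.
    land_cover_map = land_cover_logits.argmax(1, keepdim=True).float()
    patch = torch.cat([image, land_cover_map], dim=1)
    land_use_logits = LandUsePatchCNN()(patch)                   # shape: (1, 10)
    print(land_cover_logits.shape, land_use_logits.shape)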

Organisation(s)
Institute of Photogrammetry and GeoInformation (IPI)
Type
Conference article
Journal
ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume
4
Pages
251-258
No. of pages
8
ISSN
2194-9042
Publication date
23.04.2018
Publication status
Published
Peer reviewed
Yes
ASJC Scopus subject areas
Earth and Planetary Sciences (miscellaneous), Environmental Science (miscellaneous), Instrumentation
Sustainable Development Goals
SDG 15 - Life on Land
Electronic version(s)
https://doi.org/10.5194/isprs-annals-IV-3-251-2018 (Access: Open)
https://doi.org/10.15488/3436 (Access: Open)