Classification of land cover and land use based on convolutional neural networks

Authored by
Chun Yang, Franz Rottensteiner, Christian Heipke
Abstract

Land cover describes the physical material of the earth's surface, whereas land use describes the socio-economic function of a piece of land. Land use information is typically collected in geospatial databases. As such databases become outdated quickly, an automatic update process is required. This paper presents a new approach to determine land cover and to classify land use objects based on convolutional neural networks (CNN). The input data are aerial images and derived data such as digital surface models. Firstly, we apply a CNN to determine the land cover for each pixel of the input image. We compare different CNN structures, all of them based on an encoder-decoder structure for obtaining dense class predictions. Secondly, we propose a new CNN-based methodology for the prediction of the land use label of objects from a geospatial database. In this context, we present a strategy for generating image patches of identical size from the input data, which are classified by a CNN. Again, we compare different CNN architectures. Our experiments show that an overall accuracy of up to 85.7 % and 77.4 % can be achieved for land cover and land use, respectively. The classification of land cover makes a positive contribution to the classification of land use.
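
For illustration, the sketch below shows what an encoder-decoder CNN producing dense, per-pixel class predictions can look like. It is a minimal example assuming PyTorch; the layer counts, channel widths, number of input bands and number of land cover classes are illustrative assumptions, not the network architecture used in the paper.

# Minimal encoder-decoder CNN sketch for pixel-wise land cover classification.
# Hyperparameters are illustrative assumptions, not those of the paper.
import torch
import torch.nn as nn

class EncoderDecoderCNN(nn.Module):
    def __init__(self, in_channels=4, num_classes=8):
        super().__init__()
        # Encoder: two convolution blocks, each halving the spatial resolution.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
        )
        # Decoder: transposed convolutions restore the input resolution,
        # so that a class score is predicted for every pixel.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, num_classes, kernel_size=2, stride=2),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

if __name__ == "__main__":
    # Example: a batch of image patches with 4 bands (e.g. RGB + nDSM), 256 x 256 px.
    model = EncoderDecoderCNN(in_channels=4, num_classes=8)
    patches = torch.randn(2, 4, 256, 256)
    scores = model(patches)        # shape (2, num_classes, 256, 256): dense class scores
    labels = scores.argmax(dim=1)  # per-pixel land cover label map
    print(labels.shape)            # torch.Size([2, 256, 256])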

Organisational unit(s)
Institut für Photogrammetrie und Geoinformation
Type
Conference paper in journal
Journal
ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume
4
Pages
251-258
Number of pages
8
ISSN
2194-9042
Publication date
23.04.2018
Publication status
Published
Peer-reviewed
Yes
ASJC Scopus subject areas
Earth and Planetary Sciences (miscellaneous), Environmental Science (miscellaneous), Instrumentation
Sustainable Development Goals
SDG 15 – Life on Land
Electronic version(s)
https://doi.org/10.5194/isprs-annals-IV-3-251-2018 (Access: Open)
https://doi.org/10.15488/3436 (Access: Open)