Landslide identification and classification by object-based image analysis and fuzzy logic: An example from the Azdavay region (Kastamonu, Turkey)


AKSOY B., ERCANOĞLU M.

COMPUTERS & GEOSCIENCES, vol.38, no.1, pp.87-98, 2012 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 38 Issue: 1
  • Publication Date: 2012
  • DOI: 10.1016/j.cageo.2011.05.010
  • Journal Name: COMPUTERS & GEOSCIENCES
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.87-98
  • Hacettepe University Affiliated: Yes

Abstract

This study presents a data-driven, semiautomatic classification system based on object-based image analysis and fuzzy logic, applied to a selected landslide-prone area in the Western Black Sea region of Turkey. In the first stage, a multiresolution segmentation was performed on Landsat ETM+ satellite images of the study area. The model was established on the 5235 image objects obtained from this segmentation. A total of 70 landslide locations and 10 input parameters, including normalized difference vegetation index, slope angle, curvature, brightness, mean band blue, asymmetry, shape index, length/width ratio, gray level co-occurrence matrix, and mean difference to infrared band, were considered in the analyses. Membership functions were used to classify the study area with five fuzzy operators: "and", "or", "mean arithmetic", "mean geometric", and "algebraic product". To assess the performance of the resulting maps, 700 image objects that were not used in the model were taken into consideration. Based on the results, the map produced by the "fuzzy and" operator performed better than those classified by the other fuzzy operators. The methodology proposed in this study may be useful for decision makers, local administrations, and scientists interested in landslides, as well as for planning, management, and regional development purposes in landslide-prone areas. (C) 2011 Elsevier Ltd. All rights reserved.
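As a minimal sketch of how the five fuzzy operators named in the abstract combine per-object membership grades, the snippet below applies the standard fuzzy-logic formulations (minimum, maximum, arithmetic mean, geometric mean, and algebraic product) to hypothetical membership values. The membership values and the three-parameter setup are illustrative assumptions only; they are not taken from the study, which used ten parameters and 5235 image objects.

```python
import numpy as np

# Hypothetical membership grades in [0, 1] for three of the input
# parameters (e.g. NDVI, slope angle, brightness), one row per image object.
# Values are invented for illustration, not drawn from the paper.
memberships = np.array([
    [0.82, 0.67, 0.74],   # image object 1
    [0.15, 0.40, 0.22],   # image object 2
    [0.95, 0.88, 0.91],   # image object 3
])

def fuzzy_and(m):          # "and": minimum of the memberships
    return m.min(axis=1)

def fuzzy_or(m):           # "or": maximum of the memberships
    return m.max(axis=1)

def mean_arithmetic(m):    # arithmetic mean of the memberships
    return m.mean(axis=1)

def mean_geometric(m):     # geometric mean of the memberships
    return np.prod(m, axis=1) ** (1.0 / m.shape[1])

def algebraic_product(m):  # product of all memberships
    return np.prod(m, axis=1)

operators = [("fuzzy and", fuzzy_and), ("fuzzy or", fuzzy_or),
             ("mean arithmetic", mean_arithmetic),
             ("mean geometric", mean_geometric),
             ("algebraic product", algebraic_product)]

for name, op in operators:
    print(f"{name:17s}: {np.round(op(memberships), 3)}")
```

Because the "fuzzy and" operator takes the minimum across parameters, it assigns a high combined value only when every input membership is high, which is consistent with it producing the most conservative (and, in the study's assessment, best-performing) classification among the five operators.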