A Comparative Study of Deep Learning Models on Tropospheric Ozone Forecasting Using Feature Engineering Approach

REZAEI R., Naderalvojoud B., GÜLLÜ G.

Atmosphere, vol.14, no.2, 2023 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 14 Issue: 2
  • Publication Date: 2023
  • DOI: 10.3390/atmos14020239
  • Journal Name: Atmosphere
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, CAB Abstracts, Compendex, Geobase, INSPEC, Veterinary Science Database, Directory of Open Access Journals
  • Keywords: air pollution, feature engineering, ozone forecasting, deep neural network
  • Hacettepe University Affiliated: Yes


This paper investigates the effect of the architectural design of deep learning models, combined with a feature engineering approach that accounts for temporal variation in the features, on tropospheric ozone forecasting. Although deep neural network models have shown successful results by extracting features automatically from raw data, their performance in air quality forecasting is influenced by the choice of feature analysis approach and model architecture. This paper proposes a simple but effective analysis of tropospheric ozone time series data that reveals the temporal phases of the ozone evolution process and helps neural network models reflect these temporal variations. We demonstrate that accounting for the ozone evolution phases in the model architecture improves the performance of deep neural network models. We evaluated our approach on a CNN model and showed that it not only improves the CNN's performance, but that the CNN combined with our approach also boosts the performance of other deep neural network models such as LSTM. Developing the CNN, LSTM-CNN, and CNN-LSTM models with the proposed approach improved their prediction performance by 3.58%, 1.68%, and 3.37%, respectively.
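The core idea — splitting each day of an hourly ozone series into temporal phases so a model can treat each phase of the ozone evolution process as a separate input — can be sketched as follows. This is a minimal illustration only: the phase names and hour boundaries below are hypothetical placeholders, not the segmentation derived in the paper, and the paper's actual pipeline feeds such windows into CNN/LSTM architectures rather than just reshaping them.

```python
import numpy as np

def phase_windows(hourly_ozone, phases):
    """Split an hourly series into per-day, per-phase windows.

    hourly_ozone: 1-D array whose length is a multiple of 24.
    phases: mapping of phase name -> (start_hour, end_hour).
    Returns a dict of arrays shaped (n_days, phase_length), one per
    phase, suitable as separate input channels for a forecasting model.
    """
    days = hourly_ozone.reshape(-1, 24)  # one row per day
    return {name: days[:, s:e] for name, (s, e) in phases.items()}

# Hypothetical phase boundaries (hours of day); the paper's actual
# segmentation of the ozone evolution process may differ.
PHASES = {"night": (0, 6), "buildup": (6, 12),
          "peak": (12, 18), "decay": (18, 24)}

rng = np.random.default_rng(0)
series = rng.random(24 * 7)  # one week of synthetic hourly data
windows = phase_windows(series, PHASES)
print({name: w.shape for name, w in windows.items()})
# each phase yields a (7 days, 6 hours) window
```

Each per-phase window could then be passed to its own convolutional branch, which is one straightforward way to let the architecture reflect the temporal phases the analysis reveals.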