Review

A Review of Neural Networks for Air Temperature Forecasting

1 Department of Civil and Environmental Engineering and Water Resources Research Center, University of Hawaii at Manoa, Honolulu, HI 96822, USA
2 Department of Environmental Engineering, Gyeongsang National University, Jinju 52725, Korea
* Author to whom correspondence should be addressed.
Water 2021, 13(9), 1294; https://doi.org/10.3390/w13091294
Submission received: 17 March 2021 / Revised: 27 April 2021 / Accepted: 2 May 2021 / Published: 4 May 2021

Abstract

The accurate forecast of air temperature plays an important role in water resources management, land–atmosphere interaction, and agriculture. However, it is difficult to accurately predict air temperature due to its non-linear and chaotic nature. Several deep learning techniques have been proposed over the last few decades to forecast air temperature. This study provides a comprehensive review of artificial neural network (ANN)-based approaches (such as the recurrent neural network (RNN) and long short-term memory (LSTM)) that have been used to forecast air temperature. The focus is on studies published during 2005–2020. The review shows that neural network models can be employed as promising tools for forecasting air temperature. Although ANN-based approaches have been utilized widely to predict air temperature owing to their fast computing speed and ability to deal with complex problems, no consensus yet exists on the best method. Additionally, it is found that ANN methods are mainly viable for short-term air temperature forecasting. Finally, some future directions and recommendations are presented.

1. Introduction

Global warming has recently drawn scientists’ attention since it is correlated with the rise in air temperature. Increasing air temperature leads to changes in climatic conditions, such as sea-level rise and an increase in extreme events, ultimately negatively impacting humans’ lives [1]. Air temperature is a state variable of the atmosphere and affects atmospheric and land surface processes [2,3,4]. Forecasting air temperature is an important part of weather prediction because it is used to protect human lives and property. People may suffer health problems when the air temperature is not in a suitable range [5,6]. Extreme changes in air temperature may cause damage to plants and animals. The accurate forecast of air temperature is essential due to its significant effect on various sectors, such as industry, energy, and agriculture [7,8]. Reliable air temperature predictions improve the accuracy of energy consumption estimates [9]. Air temperature is also one of the key factors in predicting other hydrometeorological variables, such as streamflow [10], evapotranspiration [11], and solar radiation [12]. Therefore, finding an appropriate approach for the prediction of air temperature is vital and may mitigate the consequences of global warming and climate change. Furthermore, the accurate prediction of air temperature plays an important role in establishing plans for human activities, energy policy, and business development [13].
Recently, models based on artificial neural networks (ANNs) have attracted scientists’ attention in various disciplines, such as meteorology, water resources, and hydrology, because of their capability in capturing nonlinear relationships between inputs and outputs. Various ANN-based approaches have performed successfully in many hydrologic problems, such as flood [14], rainfall [15], water quality [16], and air temperature [17] prediction. Inspired by biological nervous systems, ANNs are powerful tools for modeling nonlinear relations between dependent and independent variables. Generalization is one of the capabilities of ANNs, allowing them to predict patterns that were not provided to them during training. As a result, ANN forecasting models are able to provide a more promising performance than physical and statistical approaches. They are also easily accessible as toolboxes in commonly used programming environments (e.g., Matlab and Python).
Different types of ANNs (e.g., multi-layer perceptron (MLP), recurrent neural network (RNN), long short-term memory (LSTM), convolutional neural network (CNN), etc.) have been utilized to forecast air temperature [18]. Each type has its unique structure to learn the air temperature patterns and forecast them. However, accurate air temperature forecasting has remained a major challenge (especially when the forecast time horizon increases) for many decades due to the chaotic and complex nature of air temperature data.
This paper provides a review of neural network (NN) models for air temperature forecasting, focusing on studies published during the last 15 years (2005–2020). This review also identifies new research problems arising from the published literature. To the best of our knowledge, this is the first review paper on the application of neural network-based techniques to air temperature prediction. In total, 26 studies that used different kinds of neural networks, such as MLP, the generalized feed forward neural network (GFFNN), the modular neural network (MNN), RNN, and LSTM, to predict air temperature are discussed. The review of neural network methodologies and their performance will encourage researchers to utilize these techniques to forecast air temperature.

2. ANN Inputs

This work focuses on the widely used neural network approaches (e.g., MLP, RNN, and LSTM) for air temperature prediction. Different studies have used various input variables, which can significantly impact model performance. In a number of studies (e.g., Chattopadhyay et al. [19], Ustaoglu et al. [20]), air temperature was predicted from the historical air temperature data by accounting for time lags (the so-called univariate model). Another common approach is to use other relevant climatic variables (e.g., rainfall, air humidity, wind speed, air pressure, etc.) as inputs to forecast air temperature (the so-called multivariate model) [21,22]. Therefore, the ANN models can be categorized into two groups: the first group uses only the historical air temperature measurements as inputs, and the second group employs air temperature together with other relevant hydrologic variables.
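For illustration, the two groups of inputs can be prepared with a simple sliding-window routine. The sketch below is a minimal example in NumPy and is not taken from any of the reviewed studies; the function name, the synthetic series, and the seven-day lag are illustrative assumptions.

```python
import numpy as np

def make_lagged_inputs(series, lags, extra_series=None):
    """Build (X, y) pairs for one-step-ahead forecasting.

    series       : 1-D array of historical air temperature
    lags         : number of past time steps used as predictors
    extra_series : optional list of 1-D arrays (e.g., humidity, wind speed)
                   aligned with `series`; appended for a multivariate model
    """
    series = np.asarray(series, dtype=float)
    rows, targets = [], []
    for t in range(lags, len(series)):
        lagged = series[t - lags:t]                    # past temperatures
        if extra_series is not None:                   # multivariate case
            lagged = np.concatenate(
                [lagged] + [np.asarray(s)[t - lags:t] for s in extra_series]
            )
        rows.append(lagged)
        targets.append(series[t])                      # next-step temperature
    return np.array(rows), np.array(targets)

# Univariate example: predict the next day from the past 7 days (synthetic data)
temperature = np.sin(np.linspace(0, 20, 200)) + 15.0
X_uni, y_uni = make_lagged_inputs(temperature, lags=7)

# Multivariate example: add a synthetic humidity series as extra predictors
humidity = np.cos(np.linspace(0, 20, 200)) * 10 + 60.0
X_multi, y_multi = make_lagged_inputs(temperature, lags=7, extra_series=[humidity])
```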

3. Artificial Neural Networks (ANNs)

ANNs are a class of artificial intelligence models that imitate the biological structure of the human brain. In this section, three commonly used types of ANNs (i.e., MLP, RNN, and LSTM) are described. For a detailed description of the radial basis function (RBF) network, modular neural network (MNN), Ward-style ANN, convolutional recurrent neural network (CRNN), convolutional long short-term memory (ConvLSTM), generalized regression neural network (GRNN), and convolutional neural network (CNN), the readers are referred to Ustaoglu et al. [20], Chattopadhyay et al. [19], Smith et al. [22], Zhang et al. [23], Kreuzer et al. [24], Kreuzer et al. [24], and Lee et al. [25], respectively.

3.1. Multilayer Perceptron (MLP)

The MLP is a feed-forward ANN, which has been used widely for air temperature prediction [19,26]. The MLP is composed of an input layer, one or more hidden layers, and an output layer [27]. The basic processing elements of the MLP are interconnected neurons or nodes, which are connected by adaptable weights (Figure 1). Each neuron receives input signals from the outputs of other neurons. The output of a neuron is a function of the weighted input, bias, and activation function [28]:
$y = f\left( \sum_{i=1}^{n} w_i x_i + b \right)$
where $y$ is the output of the neuron, $x_i$ is the ith input to the neuron, $w_i$ is the connection weight of the ith input, $b$ is the bias, and $f$ is the activation function.
During the training process, all weights and biases are adjusted by a learning algorithm to minimize the forecasting error of networks. Then, the validation process is employed to evaluate the performance of the neural network [17].
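A minimal numerical illustration of the neuron equation above is given below; the NumPy sketch evaluates a single neuron and a small one-hidden-layer MLP with random, untrained weights, and all names and sizes are illustrative assumptions rather than a trained model from any reviewed study.

```python
import numpy as np

def neuron(x, w, b, f=np.tanh):
    """Output of one neuron: y = f(sum_i w_i * x_i + b)."""
    return f(np.dot(w, x) + b)

def mlp_forward(x, W1, b1, W2, b2, f=np.tanh):
    """One-hidden-layer MLP: hidden activations, then a linear output neuron."""
    hidden = f(W1 @ x + b1)          # each row of W1 is one hidden neuron
    return float(W2 @ hidden + b2)   # single output (forecasted temperature)

rng = np.random.default_rng(0)
x = np.array([14.2, 15.1, 16.0, 15.4, 14.8, 15.9, 16.3])  # 7 lagged temperatures
W1, b1 = rng.normal(size=(5, 7)), rng.normal(size=5)       # 5 hidden neurons
W2, b2 = rng.normal(size=5), rng.normal()                  # output layer
print(mlp_forward(x, W1, b1, W2, b2))
```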

3.2. Recurrent Neural Network (RNN)

An RNN is a class of ANNs developed for processing sequential data [29]. Unlike traditional feed-forward ANNs, an RNN has recurrent layers in which neurons are connected to one another (Figure 2). Hence, information from a neuron is transferred to neurons in both the same layer and the next layer. As seen in Figure 2, an RNN also has a hidden state that recalls information from earlier in the sequence. The RNN computes new states by recursively applying its activation function to prior states and new inputs. The hidden state value $h_t$ at a time step $t$ can be obtained via:
$h_t = f\left( w_x x_t + u_h h_{t-1} + b \right)$
where $x_t$, $h_{t-1}$, $w_x$, and $u_h$ are the input at time $t$, the hidden state of the previous step ($t-1$), the weight for the input, and the weight for the previous state value, respectively. Additionally, $b$ is the bias and $f$ is the activation function applied to the hidden state at the current time.
RNNs are convenient for processing time series because their feedback connections, which transmit information from the previous input to the next one, allow them to model the temporal dynamics in a sequence of data. However, a shallow or simple RNN often encounters the vanishing gradient problem [30]; consequently, it cannot model long-term temporal patterns, which weakens the network. In recent years, the vanishing gradient problem of RNNs has been addressed by the long short-term memory (LSTM) neural network, at the cost of a greater computational burden.
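As a minimal sketch of the recurrence above, the following NumPy snippet unrolls the hidden-state update over a short sequence; scalar weights are used purely for readability, and all values are illustrative assumptions rather than trained parameters.

```python
import numpy as np

def rnn_hidden_states(x_seq, w_x, u_h, b, f=np.tanh):
    """Compute h_t = f(w_x * x_t + u_h * h_{t-1} + b) over a sequence.

    Scalar weights are used for clarity; in practice w_x and u_h are matrices.
    """
    h = 0.0                      # initial hidden state h_0
    states = []
    for x_t in x_seq:
        h = f(w_x * x_t + u_h * h + b)
        states.append(h)
    return np.array(states)

x_seq = np.array([14.2, 15.1, 16.0, 15.4, 14.8])   # lagged temperatures
print(rnn_hidden_states(x_seq, w_x=0.5, u_h=0.8, b=0.1))
```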

3.3. Long Short-Term Memory (LSTM)

LSTM was first presented by Hochreiter and Schmidhuber [31]. LSTM is a class of RNN that was developed for learning long-term dependencies. Each neuron in an LSTM is a memory cell, which includes three gates (an input gate, a forget gate, and an output gate) to control the flow of information between different time steps (Figure 3). Unlike conventional ANNs, the LSTM cell generates two separate values through a series of activations and operations. One value is the cell state ($c_t$), which carries information and stores memory over the long term, and the other is the output of the hidden layer ($s_t$). When the number of inputs increases, the gradients with respect to the first several inputs vanish and shrink toward zero. The LSTM can mitigate this problem by using internal gates that add, edit, or remove information in the cell. The readers are referred to Tran et al. [32] for a detailed description of LSTM.
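To make the gating mechanism concrete, the sketch below steps a single LSTM cell forward in NumPy using the standard input/forget/output-gate formulation; the stacked parameter layout, dimensions, and random weights are illustrative assumptions, not the exact formulation used in any reviewed study.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell_step(x_t, s_prev, c_prev, W, U, b):
    """One LSTM time step.

    W, U, b hold the stacked parameters for the input (i), forget (f), and
    output (o) gates plus the candidate cell state (g), each of size `hidden`.
    Returns the new hidden output s_t and cell state c_t.
    """
    hidden = c_prev.shape[0]
    z = W @ x_t + U @ s_prev + b                 # shape (4 * hidden,)
    i = sigmoid(z[0 * hidden:1 * hidden])        # input gate
    f = sigmoid(z[1 * hidden:2 * hidden])        # forget gate
    o = sigmoid(z[2 * hidden:3 * hidden])        # output gate
    g = np.tanh(z[3 * hidden:4 * hidden])        # candidate cell state
    c_t = f * c_prev + i * g                     # long-term memory update
    s_t = o * np.tanh(c_t)                       # hidden-layer output
    return s_t, c_t

rng = np.random.default_rng(0)
n_in, hidden = 3, 4                              # e.g., temperature, humidity, wind
W = rng.normal(size=(4 * hidden, n_in))
U = rng.normal(size=(4 * hidden, hidden))
b = np.zeros(4 * hidden)
s, c = np.zeros(hidden), np.zeros(hidden)
for x_t in rng.normal(size=(5, n_in)):           # five time steps of inputs
    s, c = lstm_cell_step(x_t, s, c, W, U, b)
print(s)
```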

4. Related Work

Herein, we provide a summary of studies, which adopted neural network models to forecast air temperature for a few minutes to several months ahead (see Table 1). The focus is on reviewing the papers published during the last 15 years (2005–2020). The reviewed studies are categorized based on their inputs into the univariate and multivariate models.

4.1. Univariate Models

Ustaoglu et al. [20] employed three distinct ANNs, namely feed-forward back propagation (FFBP), radial basis function (RBF), and generalized regression neural network (GRNN), to forecast daily mean, maximum, and minimum air temperature in Turkey. The models used daily air temperature measurements of the previous seven days to forecast 1-day-ahead air temperature. Using the correlation coefficient ($R^2$), root mean square error (RMSE), and index of agreement (IA) statistical metrics, they showed that all the utilized neural network methods produced satisfactory results. Additionally, air temperature predictions from the ANN models were compared to those of multiple linear regression (MLR) approaches. The ANN methods were found to be slightly superior to the MLR models.
Chattopadhyay et al. [19] applied three types of ANNs (multilayer perceptron (MLP), generalized feed forward neural network (GFFNN), and modular neural network (MNN)) to predict monthly maximum air temperature across the northeast of India. The periodicity of 12 months was found in the monthly maximum air temperature time series, and therefore a multiplicative model was used to deseasonalize the data. Additionally, the increasing trend in time series was identified by both the Mann–Kendall non-parametric and parametric tests. A trend equation was fitted to remove the trend from the deseasonalized time series. Consequently, the monthly maximum air temperature time series was found to be stationary. This allowed the networks to perform more efficiently. In their study, maximum air temperature values in a number of previous months (ranging from 2 to 4) were used as inputs to the neural networks. It was found that the MNN model using air temperature measurements in the previous four months performed better than MLP and GFFNN.
Abhishek et al. [33] investigated the feasibility of the feed-forward neural network (FFNN) for predicting daily maximum air temperature in Canada during 1999–2009. The input data consisted of daily maximum air temperature measurements over the past 10 years. Different transfer functions and numbers of hidden layers and neurons were tested to evaluate the performance of the neural networks. Finally, the results showed that the ANN with 5 hidden layers, 10 neurons per layer, and a tan-sigmoid transfer function generated the best maximum air temperature predictions.
Kumar et al. [34] used FFNN to forecast weekly mean air temperature in India. Air temperature data in the previous six weeks were used in various ANN architectures to predict 1-week-ahead air temperature. The predictive ability of different configurations was assessed by computing the $R^2$ and RMSE metrics. Finally, a two-hidden-layer model with five neurons in each layer was found to produce the best results.
Optimizing the hyperparameters of ANNs improves their ability to forecast hydrologic variables [49,50]. Tran et al. [32] employed a genetic algorithm (GA) to optimize the hyperparameters of conventional multilayer ANN, RNN, and LSTM models. The hybrid models were used to forecast maximum air temperature at the Cheongju station in South Korea. Air temperature observations in the last seven days were used as inputs to forecast 1- to 15-day-ahead maximum air temperature. The results showed that the hybrid GA-LSTM performed better than the other models for long-term air temperature forecasting.
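A highly simplified sketch of GA-based hyperparameter search is shown below. It is not the procedure of Tran et al. [32]; the two genes (number of hidden layers and neurons per layer), the population settings, and the placeholder fitness function are illustrative assumptions, and in practice evaluate() would train a network and return its validation error.

```python
import random

def evaluate(layers, neurons):
    """Placeholder fitness: stands in for the validation RMSE of a network with
    the given architecture. Replace with actual model training and evaluation."""
    return abs(layers - 2) + abs(neurons - 16) / 10.0 + random.random() * 0.1

def genetic_search(pop_size=10, generations=20, mutation_rate=0.3):
    # Each individual is (n_hidden_layers, n_neurons_per_layer)
    pop = [(random.randint(1, 5), random.randint(2, 64)) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda ind: evaluate(*ind))
        parents = scored[: pop_size // 2]             # selection: keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a[0], b[1])                      # crossover: mix parent genes
            if random.random() < mutation_rate:       # mutation: perturb genes
                child = (max(1, child[0] + random.choice((-1, 1))),
                         max(2, child[1] + random.randint(-8, 8)))
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda ind: evaluate(*ind))

best_layers, best_neurons = genetic_search()
print("best architecture:", best_layers, "hidden layers,", best_neurons, "neurons")
```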
In another effort, Tran and Lee [35] applied the traditional multilayer ANN models to predict 1-day-ahead maximum air temperature at 55 stations in South Korea. They tried various numbers of parameters (i.e., the total number of weights and biases) by using different numbers of neurons and hidden layers. It was found that the ANN model with 5 hidden layers and a total of 49 weights and biases generated the smallest error at 52 stations in South Korea.
Other studies used more complex deep learning architectures for air temperature forecasting. For example, Zhang et al. [23] forecasted daily average air temperature for 4 days ahead by a convolutional recurrent neural network (CRNN), which combined convolutional neural networks (CNNs) with recurrent neural networks (RNNs). They utilized daily air temperature data over China from 1952 to 2018 to train the CRNN. The results demonstrated that their model could predict air temperature successfully based on the previous air temperature data.
Li et al. [36] employed a stacked long short-term memory network (stacked LSTM) to predict half-hourly air temperature from its historical observations. The proposed LSTM model had three hidden layers with 20, 10, and 4 memory cells, respectively. The fully connected layer and output layer had four and one neurons, respectively. Finally, the LSTM model was compared with deep neural network (DNN) and random forest (RF) approaches under different sliding windows. It was observed that the network built by stacked LSTM is superior to the DNN and RF methods.
Afzali et al. [37] developed two different types of neural networks (namely, FFNN and Elman neural network) to predict 1-day-ahead mean, minimum, and maximum air temperature in the Kerman city (Iran) from the corresponding values in the last 15 days. The results showed that both neural networks provided satisfactory air temperature predictions. Additionally, the Elman network generated better forecasts.
De and Debnath [38] employed the FFNN model to forecast the air temperature of the monsoon months (June, July, and August) in India for 1901–2003. In their study, the monthly mean air temperatures in December, January, February, March, April, and May were used as inputs.

4.2. Multivariate Models

Smith et al. [22] used ANN models to forecast hourly air temperature for 1–12 steps ahead. The inputs consisted of air temperature, relative humidity, wind speed, solar radiation, and rainfall measurements in the previous 24 h. The data from 2001 to 2005 in the southern and central regions of Georgia were used to train and test the networks. The models used a linear input layer, and three equally sized parallel “slabs” using the Gaussian, Gaussian complement, and hyperbolic tangent activation functions in the hidden layer. The number of hidden nodes varied from 2 to 75 nodes. The results showed that the model with 40 nodes in the hidden layer produced the most accurate predictions.
Smith et al. [39] forecasted air temperature for 1–12 h ahead by the Ward-style ANN model. Hourly air temperature, wind speed, relative humidity, solar radiation, and rainfall observations as well as their hourly rate of change in the last 24 h were used as inputs. The data were recorded by the Georgia Automated Environmental Monitoring Network (AEMN) during 1997–2005. The temperature prediction models had a single hidden layer with 120 nodes that were distributed equally among the three slabs. The MAE of the evaluation set (2004–2005) ranged from 0.516 °C for the 1-h horizon to 1.873 °C for the 12-h horizon prediction. Additionally, two ensemble techniques (parallel and series aggregations) were investigated and found to be infeasible for air temperature prediction.
Altan Dombayci and Gölcü [17] employed the MLP neural network with Levenberg–Marquardt (LM) feed-forward backpropagation algorithms to predict daily mean air temperature in Turkey for one day ahead. The model was trained and tested by the data in 2003–2005 and 2006, respectively. The inputs of the network were the month of the year, the day of the month, and mean temperature of the previous day. The number of hidden neurons was varied from 3 to 30, and the network with 6 hidden neurons produced the best result.
Many studies utilized deep learning networks and geographical information to predict air temperature. Bilgili and Sahin [28] used three geographical variables (latitude, longitude, and altitude) and the number of months (1, 2, …, 12) as the inputs of the ANN model to predict monthly air temperature and rainfall in Turkey. The data from 76 weather stations between 1975 and 2006 were used to train and test the model. They showed that the ANN approach can predict monthly temperature and rainfall fairly well using the geographical variables and number of months.
Kisi and Shiri [40] used the number of the month (1–12) and geographical information (latitude, longitude, and altitude) as inputs to an ANN and the adaptive neuro-fuzzy inference system (ANFIS) to predict monthly average air temperature at 30 sites in Iran. The robustness of the two models was compared using the RMSE, MAE, and coefficient of determination ($R^2$) metrics. The results showed that the performance of the ANN was better than that of ANFIS at most stations.
The geographical variables (latitude, longitude, and altitude) along with the month of the year (1–12) were fed into feed-forward network (FFN), ANFIS, support vector regression (SVR), and gene expression programming (GEP) models to predict monthly mean air temperature at 50 stations in Iran by Kisi and Sanikhani [41]. The data of 30, 10, and 10 stations were used for training, validating, and testing the models, respectively. SVR had the best performance, followed by ANFIS and FFN.
Şahin [42] fed the urban heat island (UHI) effect, the number of the month (1–12), altitude, latitude, longitude, and monthly mean land surface temperatures of 20 cities in Turkey into a three-layer FFN to predict monthly mean air temperature. The monthly data from 1995 to 2004 were used to train the FFN model, while the data of 2005 were used to test it. In their study, the number of hidden neurons was increased from 1 to 50 to find the optimal neural network. In the test period, the RMSE of monthly mean air temperature predictions at the 20 investigated cities ranged from 0.705 to 2.600 K.
Salcedo-Sanz et al. [26] compared the performance of SVR and MLP for predicting monthly mean air temperature at 10 sites in Australia and New Zealand. Air temperature from the previous month, two dummy variables $d_1 = \sin(2\pi n / 12)$ and $d_2 = \cos(2\pi n / 12)$ (where $n = 0, 1, \ldots, 11$ depending on the month of the year), the Southern Oscillation Index (SOI), Indian Ocean Dipole (IOD), and Pacific Decadal Oscillation (PDO) were used as inputs [51,52]. The results showed that SVR was able to provide more accurate predictions than MLP.
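The two dummy variables above are a standard cyclical encoding of the month; a small illustrative sketch (the function name is assumed, not taken from the cited study) is given below.

```python
import numpy as np

def month_dummies(n):
    """Cyclical encoding of the month index n = 0, 1, ..., 11."""
    d1 = np.sin(2 * np.pi * n / 12)
    d2 = np.cos(2 * np.pi * n / 12)
    return d1, d2

print(month_dummies(0))   # first month  -> (0.0, 1.0)
print(month_dummies(6))   # seventh month -> (~0.0, -1.0)
```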
Akram and El [43] applied a deep LSTM network to forecast air temperature, humidity, and wind speed for 24 (or 72) h ahead in 9 cities of Morocco using the 24 (or 72) previous hourly values of air temperature, humidity, and wind speed as inputs. The model had a fully connected hidden layer (with 100 neurons) between two LSTM layers. The results showed that the proposed LSTM model could predict weather variables with high accuracy.
Jallal et al. [44] used an MLP model to predict air temperature in Morocco for 30 min ahead from the three previous half-hourly air temperature and global solar radiation measurements. They varied the number of hidden layers (from 1 to 5) and neurons (from 1 to 15) as well as the activation function (radial basis function, logistic sigmoid function, and hyperbolic tangent function) to find the best configuration. It was found that a two-hidden-layer network using the hyperbolic tangent function, with 5 and 8 hidden nodes in its layers, respectively, generated the best predictions, with an MSE of 0.272 °C and an $R^2$ of 0.997.
Park et al. [45] applied an LSTM model to forecast air temperature at three locations in South Korea. Wind speed, air temperature, and humidity were employed as inputs. The LSTM model with four layers could predict air temperature accurately for both short (6, 12, and 24 h ahead) and long (14 days in advance) periods. The results showed that the LSTM approach outperformed the deep neural network (DNN).
Huang et al. [46] utilized the RNN model to forecast daily maximum and minimum air temperature at 14 sites in Guangxi, China. Based on the climatology and persistence (CLIPER) method [53], the average, maximum, and minimum air temperature, and precipitation in the previous days, as well as a total of 50 CLIPER predictors were selected for temperature prediction. The performance of the RNN model was compared with the stepwise regression method. It was found that the accuracy of RNN was higher than that of the stepwise regression method.
Sundaram et al. [47] compared the performance of three machine learning models, namely support vector machine (SVM), MLP, and RNN, for daily air temperature prediction. Different meteorological variables, such as air temperature, atmospheric pressure, relative humidity, wind direction, total cloud cover, horizontal visibility, and dew point temperature, were fed into the abovementioned models. The RMSE of the air temperature forecasts from the RNN was 1.41 °C, which is lower than the RMSEs of 3.1 °C and 6.67 °C from the MLP and SVM, respectively.
Roy [48] explored three deep neural networks namely, MLP, LSTM, and hybrid CNN-LSTM, to forecast the air temperature for 1–10 days ahead. The past seven days of wind speed, precipitation, snow depth, and mean, maximum, and minimum temperature were used as inputs. The results indicated that the hybrid CNN-LSTM model outperformed the other models.
Kreuzer et al. [24] used the convolutional long short-term memory (ConvLSTM) method to forecast air temperature up to 24 h in advance at five weather stations in Germany during 2009–2018. They compared the performance of ConvLSTM with those of the seasonal autoregressive integrated moving average (SARIMA) model, the seasonal naive approach, and univariate and multivariate LSTMs. Hourly air temperature, relative humidity, cloud coverage, precipitation, wind speed and direction, month of year, hour of day, sea-level air pressure, and the difference between the air pressure at the station and at sea level were used as inputs to the multivariate LSTM and ConvLSTM. They showed that the seasonal naive approach had the worst performance for most of the prediction horizons. While the SARIMA model and univariate LSTM network performed well for the first two- to three-hour air temperature forecasts, the ConvLSTM and multivariate LSTM showed better performance for longer forecast horizons. At the stations with large variations of air temperature during the day, ConvLSTM outperformed the other methods.
Lee et al. [25] employed three neural network models (namely MLP, LSTM, and CNN) to forecast the average, minimum, and maximum air temperatures for the next day in three regions of South Korea. They tried both hourly and daily air temperature, precipitation, humidity, vapor pressure, dew point temperature, atmospheric pressure, sea-level pressure, hours of sunshine, solar radiation, cloud cover, ground surface temperature, and wind speed and direction over the previous 30 days as inputs. Hourly input data provided better information for daily air temperature forecasting than daily input data. Overall, the CNN with hourly input data showed better performance than the MLP and LSTM.

5. Discussion

This study reviewed recent (2005–2020) articles that utilized ANN methodologies to forecast air temperature. For this purpose, 26 publications were chosen, categorized according to their input variables, and discussed. As described in Section 4, neural network approaches have been applied extensively in the context of air temperature forecasting. A summary of the reviewed papers is provided in Table 1. As can be seen, different types of neural network approaches, such as MLP, FFBP, GRNN, RBF, CRNN, RNN, and LSTM, were used for forecasting air temperature. Some studies in Table 1 also compared the performance of neural network techniques with that of other machine learning methods, such as SVM, GEP, and RF [36,41]. They stated that the ANN approaches often provide more accurate air temperature forecasts. Additionally, only a few studies used deep learning methods, such as RNN and LSTM, although they are highly promising.
A variety of meteorological and geographical variables have been used as inputs in the neural network approaches. They include air temperature, wind speed and direction, air pressure, precipitation, solar radiation, relative humidity, cloudiness, latitude, longitude, and altitude [24,25,28]. Among them, air temperature, relative humidity, precipitation, and wind speed are found to be the common inputs for air temperature predictions. While various meteorological variables have been fed into different types of NN approaches as inputs, the geographical inputs (i.e., latitude, longitude, altitude) have been used only in simple NN techniques (e.g., MLP and FFNN) rather than complex ones (e.g., RNN and LSTM). However, it should be noted that choosing the best input variables for a particular NN approach is difficult due to the complexity of the problem and limited number of studies.
Moreover, it is found that neural network methods are mainly applied to short-term air temperature forecasting. Only a few studies were dedicated to the medium- and long-term forecasting of air temperature, which mainly utilized the RNN and LSTM models due to their capabilities in capturing the temporal trends of air temperature time series [32]. RNN and LSTM are known as efficient methods for long-term forecasting of hydrologic variables [54,55]. However, there are only eight studies that forecasted air temperature via RNN and LSTM. It is shown that the accuracy of the abovementioned models varies mainly with the input variables and network structure. Using ancillary data (e.g., rainfall, air pressure, and humidity) in the deep learning methods improves air temperature predictions.
The literature shows that the performance of NN models is dependent on the network configuration, such as the number of hidden neurons and layers [21,22,45]. Since there is no rule for choosing the optimum number of hidden neurons and layers to avoid underfitting and overfitting of the network, they were mostly determined by trial and error [20,44]. These optimal numbers could be found by searching algorithms, such as GA [32]. In general, increasing the size of hidden layers and neurons allows the neural networks to learn complicated processes more robustly, ultimately enhancing their forecasting abilities. However, a number of studies showed that adding hidden layers and neurons did not always increase the accuracy of the network [21,44]. Based on the literature, it is still difficult to pick the best methodology for air temperature forecasting. As can be seen in Table 1, there are a few studies that take advantage of optimization techniques, such as GA, to tune the hyperparameters of neural networks for a more accurate air temperature prediction. Hybrid models can improve the accuracy of air temperature predictions [56]. However, coupling the ANN models with optimization algorithms and developing hybrid approaches have not yet been studied sufficiently. Therefore, the effectiveness of these methods should be investigated thoroughly in predicting hydrologic variables and of course, air temperature forecasting can highly benefit from them.

6. Conclusions and Future Research Work

In this paper, we conducted a comprehensive review of studies that forecasted air temperature via neural networks. The review showed that air temperature could be forecasted successfully by various types of artificial neural networks (ANNs).
According to the reviewed studies, MLP and, to a lesser extent, RBF, GRNN, and Ward-style ANN models were used to predict air temperature. It is noteworthy that the selection of input variables highly affects the robustness of ANNs. The historical air temperature and other micrometeorological variables were used as inputs to the ANNs. Additionally, the number of hidden neurons plays an important role in the accuracy of predictions. Selection of the number of hidden neurons is mostly performed by trial and error.
Overall, the neural network models have been shown to be promising and can provide reliable air temperature forecasts. It is anticipated that neural networks play an important role in the future of air temperature prediction. The information presented in this review paper helps us understand the current state of air temperature predictions.
The following directions can be considered for future works:
  • The combination of neural networks with many optimization algorithms (e.g., particle swarm optimization (PSO), harmony search, genetic programming, etc.) has not been applied to air temperature forecasting. Meta-learning approaches can be utilized in the future to forecast air temperature more accurately. They can be combined with neural network models to strengthen model robustness, since heuristic algorithms can optimize the hyperparameters of ANNs.
  • The effect analysis of relevant meteorological (e.g., maximum, minimum, and mean temperature, rainfall, and relative humidity) and geographical (e.g., latitude, longitude, and elevation) variables should be performed to improve the accuracy of air temperature prediction. Thus, feature selection techniques, such as recursive feature elimination, random forest, and the correlation coefficient, should be employed to select useful input variables for air temperature forecasting.
  • The performance of ANN-based models should be compared with that of other soft computing approaches, such as support vector machines (SVMs), the autoregressive moving average (ARMA) model, and extreme learning machines, to determine the best approach for forecasting air temperature over different hydrologic conditions and time horizons.
  • The long-term air temperature prediction has an important role in human lives and other sectors, such as energy consumption and agriculture. Hence, it should be investigated more deeply in future studies via the RNN and LSTM models. Their performance should also be compared with other medium or long-range models, such as the European Centre for Medium-Range Weather Forecasts (ECMWF) model and global weather forecast models [57].

Author Contributions

T.T.K.T. conceived and designed the study and prepared the original draft. S.M.B. and S.J.K. supervised the study and revised the manuscript. H.V. reviewed and improved the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the research invigoration program of 2020 Gyeongnam National University of Science and Technology.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Pachauri, R.K.; Allen, M.R.; Barros, V.R.; Broome, J.; Cramer, W.; Christ, R.; Church, J.A.; Clarke, L.; Dahe, Q.; Dasgupta, P.; et al. Climate Change 2014: Synthesis Report. Contribution of Working Groups I, II and III to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change. Available online: https://epic.awi.de/id/eprint/37530/ (accessed on 11 February 2021).
  2. Tajfar, E.; Bateni, S.M.; Lakshmi, V.; Ek, M. Estimation of surface heat fluxes via variational assimilation of land surface temperature, air temperature and specific humidity into a coupled land surface-atmospheric boundary layer model. J. Hydrol. 2020, 583, 124577. [Google Scholar] [CrossRef]
  3. Tajfar, E.; Bateni, S.M.; Margulis, S.A.; Gentine, P.; Auligne, T. Estimation of turbulent heat fluxes via assimilation of air temperature and specific humidity into an atmospheric boundary layer model. J. Hydrometeorol. 2020, 21, 205–225. [Google Scholar] [CrossRef]
  4. Valipour, M.; Bateni, S.M.; Gholami Sefidkouhi, M.A.; Raeini-Sarjaz, M.; Singh, V.P. Complexity of Forces Driving Trend of Reference Evapotranspiration and Signals of Climate Change. Atmosphere (Basel) 2020, 11, 1081. [Google Scholar] [CrossRef]
  5. Lan, L.; Lian, Z.; Pan, L. The effects of air temperature on office workers’ well-being, workload and productivity-evaluated with subjective ratings. Appl. Ergon. 2010, 42, 29–36. [Google Scholar] [CrossRef]
  6. Schulte, P.A.; Bhattacharya, A.; Butler, C.R.; Chun, H.K.; Jacklitsch, B.; Jacobs, T.; Kiefer, M.; Lincoln, J.; Pendergrass, S.; Shire, J.; et al. Advancing the framework for considering the effects of climate change on worker safety and health. J. Occup. Environ. Hyg. 2016, 13, 847–865. [Google Scholar] [CrossRef] [Green Version]
  7. Sharma, N.; Sharma, P.; Irwin, D.; Shenoy, P. Predicting solar generation from weather forecasts using machine learning. In Proceedings of the 2011 IEEE International Conference on Smart Grid Communications, Brussels, Belgium, 17–20 October 2011; pp. 528–533. [Google Scholar]
  8. Sardans, J.; Peñuelas, J.; Estiarte, M. Warming and drought alter soil phosphatase activity and soil P availability in a Mediterranean shrubland. Plant Soil 2006, 289, 227–238. [Google Scholar] [CrossRef]
  9. Green, M.A. General temperature dependence of solar cell performance and implications for device modelling. Prog. Photovoltaics Res. Appl. 2003, 11, 333–340. [Google Scholar] [CrossRef]
  10. Tang, C.; Crosby, B.T.; Wheaton, J.M.; Piechota, T.C. Assessing streamflow sensitivity to temperature increases in the Salmon River Basin, Idaho. Glob. Planet. Change 2012, 88–89, 32–44. [Google Scholar] [CrossRef]
  11. Jovic, S.; Nedeljkovic, B.; Golubovic, Z.; Kostic, N. Evolutionary algorithm for reference evapotranspiration analysis. Comput. Electron. Agric. 2018, 150, 1–4. [Google Scholar] [CrossRef]
  12. Marzo, A.; Trigo, M.; Alonso-Montesinos, J.; Martínez-Durbán, M.; López, G.; Ferrada, P.; Fuentealba, E.; Cortés, M.; Batlles, F.J. Daily global solar radiation estimation in desert areas using daily extreme temperatures and extraterrestrial radiation. Renew. Energy 2017, 113, 303–311. [Google Scholar] [CrossRef]
  13. Smith, D.M.; Cusack, S.; Colman, A.W.; Folland, C.K.; Harris, G.R.; Murphy, J.M. Improved Surface Temperature Prediction for the Coming Decade from a Global Climate Model. Science 2007, 317, 796–799. [Google Scholar] [CrossRef] [Green Version]
  14. Yang, T.; Sun, F.; Gentine, P.; Liu, W.; Wang, H.; Yin, J.; Du, M.; Liu, C. Evaluation and machine learning improvement of global hydrological model-based flood simulations. Environ. Res. Lett. 2019, 14. [Google Scholar] [CrossRef]
  15. Lee, J.; Kim, C.G.; Lee, J.E.; Kim, N.W.; Kim, H. Application of artificial neural networks to rainfall forecasting in the Geum River Basin, Korea. Water (Switzerland) 2018, 10, 1448. [Google Scholar] [CrossRef] [Green Version]
  16. Zou, Q.; Xiong, Q.; Li, Q.; Yi, H.; Yu, Y.; Wu, C. A water quality prediction method based on the multi-time scale bidirectional long short-term memory network. Environ. Sci. Pollut. Res. 2020, 27, 16853–16864. [Google Scholar] [CrossRef] [PubMed]
  17. Altan Dombayci, Ö.; Gölcü, M. Daily means ambient temperature prediction using artificial neural network method: A case study of Turkey. Renew. Energy 2009, 34, 1158–1161. [Google Scholar] [CrossRef]
  18. Cifuentes, J.; Marulanda, G.; Bello, A.; Reneses, J. Air temperature forecasting using machine learning techniques: A review. Energies 2020, 13, 4215. [Google Scholar] [CrossRef]
  19. Chattopadhyay, S.; Jhajharia, D.; Chattopadhyay, G. Univariate modelling of monthly maximum temperature time series over northeast India: Neural network versus Yule-Walker equation based approach. Meteorol. Appl. 2011, 18, 70–82. [Google Scholar] [CrossRef]
  20. Ustaoglu, B.; Cigizoglu, H.K.; Karaca, M. Forecast of daily mean, maximum and minimum temperature time series by three artificial neural network methods. Meteorol. Appl. 2008, 15, 431–445. [Google Scholar] [CrossRef]
  21. Fahimi Nezhad, E.; Fallah Ghalhari, G.; Bayatani, F. Forecasting Maximum Seasonal Temperature Using Artificial Neural Networks “Tehran Case Study”. Asia-Pacific J. Atmos. Sci. 2019, 55, 145–153. [Google Scholar] [CrossRef]
  22. Smith, B.A.; Mcclendon, R.W.; Hoogenboom, G. Improving Air Temperature Prediction with Artificial Neural Networks. Int. J. Comput. Inf. Eng. 2007, 1, 3159. [Google Scholar]
  23. Zhang, Z.; Dong, Y.; Yuan, Y. Temperature Forecasting via Convolutional Recurrent Neural Networks Based on Time-Series Data. Complexity 2020, 2020. [Google Scholar] [CrossRef]
  24. Kreuzer, D.; Munz, M.; Schlüter, S. Short-term temperature forecasts using a convolutional neural network — An application to different weather stations in Germany. Mach. Learn. with Appl. 2020, 2, 100007. [Google Scholar] [CrossRef]
  25. Lee, S.; Lee, Y.S.; Son, Y. Forecasting daily temperatures with different time interval data using deep neural networks. Appl. Sci. 2020, 10, 1609. [Google Scholar] [CrossRef] [Green Version]
  26. Salcedo-Sanz, S.; Deo, R.C.; Carro-Calvo, L.; Saavedra-Moreno, B. Monthly prediction of air temperature in Australia and New Zealand with machine learning algorithms. Theor. Appl. Climatol. 2016, 125, 13–25. [Google Scholar] [CrossRef]
  27. Rajendra, P.; Murthy, K.V.N.; Subbarao, A.; Boadh, R. Use of ANN models in the prediction of meteorological data. Model. Earth Syst. Environ. 2019, 5, 1051–1058. [Google Scholar] [CrossRef]
  28. Bilgili, M.; Sahin, B. Prediction of long-term monthly temperature and rainfall in Turkey. Energy Sources, Part A Recover. Util. Environ. Eff. 2010, 32, 60–71. [Google Scholar] [CrossRef]
  29. De Jesús, O.; Hagan, M.T. Backpropagation Through Time for a General Class of Recurrent Network. In Proceedings of the International Joint Conference on Neural Networks (Cat. No.01CH37222), Washington, DC, USA, 15–19 July 2001; pp. 2638–2643. [Google Scholar] [CrossRef]
  30. Hochreiter, S. The vanishing gradient problem during learning. Int. J. Uncertainty, Fuzziness Knowledge-Based Syst. 1998, 6, 107–116. [Google Scholar] [CrossRef] [Green Version]
  31. Hochreiter, S.; Schmidhuber, J. Long Short-Term Memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef] [PubMed]
  32. Tran, T.K.T.; Lee, T.; Shin, J.-Y.; Kim, J.-S.; Kamruzzaman, M. Deep Learning-Based Maximum Temperature Forecasting Assisted with Meta-Learning for Hyperparameter Optimization. Atmosphere (Basel) 2020, 11, 487. [Google Scholar] [CrossRef]
  33. Abhishek, K.; Singh, M.P.; Ghosh, S.; Anand, A. Weather Forecasting Model using Artificial Neural Network. Procedia Technol. 2012, 4, 311–318. [Google Scholar] [CrossRef] [Green Version]
  34. Kumar, P.; Kashyap, P.; Ali, J. Temperature Forecasting using Artificial Neural Networks (ANN). J. Hill Agric. 2013. [Google Scholar] [CrossRef]
  35. Tran, T.T.K.; Lee, T.; Kim, J.S. Increasing neurons or deepening layers in forecasting maximum temperature time series? Atmosphere (Basel) 2020, 11, 1072. [Google Scholar] [CrossRef]
  36. Li, C.; Zhang, Y.; Zhao, G. Deep Learning with Long Short-Term Memory Networks for Air Temperature Predictions. In Proceedings of the 2019 International Conference on Artificial Intelligence and Advanced Manufacturing (AIAM), Dublin, Ireland, 16–18 October 2019; pp. 243–249. [Google Scholar] [CrossRef]
  37. Afzali, M.; Afzali, A.; Zahedi, G. The Potential of Artificial Neural Network Technique in Daily and Monthly Ambient Air Temperature Prediction. Int. J. Environ. Sci. Dev. 2012, 3, 33–38. [Google Scholar] [CrossRef] [Green Version]
  38. De, S.S.; Debnath, A. Artificial Neural Network Based Prediction of Maximum and Minimum Temperature in the Summer Monsoon Months over India. Appl. Phys. Res. 2009, 1, 37–44. [Google Scholar] [CrossRef] [Green Version]
  39. Smith, B.A.; Hoogenboom, G.; McClendon, R.W. Artificial neural networks for automated year-round temperature prediction. Comput. Electron. Agric. 2009, 68, 52–61. [Google Scholar] [CrossRef]
  40. Kisi, O.; Shiri, J. Prediction of long-term monthly air temperature using geographical inputs. Int. J. Climatol. 2014, 34, 179–186. [Google Scholar] [CrossRef]
  41. Kisi, O.; Sanikhani, H. Modelling long-term monthly temperatures by several data-driven methods using geographical inputs. Int. J. Climatol. 2015, 35, 3834–3846. [Google Scholar] [CrossRef]
  42. Şahin, M. Modelling of air temperature using remote sensing and artificial neural network in Turkey. Adv. Sp. Res. 2012, 50, 973–985. [Google Scholar] [CrossRef]
  43. Akram, M.; El, C. Sequence to Sequence Weather Forecasting with Long Short-Term Memory Recurrent Neural Networks. Int. J. Comput. Appl. 2016, 143, 7–11. [Google Scholar] [CrossRef]
  44. Jallal, M.A.; Chabaa, S.; El Yassini, A.; Zeroual, A.; Ibnyaich, S. Air temperature forecasting using artificial neural networks with delayed exogenous input. In Proceedings of the 2019 International Conference on Wireless Technologies, Embedded and Intelligent Systems (WITS), Fez, Morocco, 3–4 April 2019; pp. 1–6. [Google Scholar] [CrossRef]
  45. Park, I.; Kim, H.S.; Lee, J.; Kim, J.H.; Song, C.H.; Kim, H. Temperature Prediction Using the Missing Data Refinement Model Based on a Long Short-Term Memory Neural Network. Atmosphere (Basel) 2019, 10, 718. [Google Scholar] [CrossRef] [Green Version]
  46. Huang, Y.; Zhao, H.; Huang, X. A Prediction Scheme for Daily Maximum and Minimum Temperature Forecasts Using Recurrent Neural Network and Rough set. In IOP Conference Series: Earth and Environmental Science; IOP Publishing: Bristol, UK, 2019; Volume 237. [Google Scholar] [CrossRef]
  47. Sundaram, M.; Prakash, M.; Surenther, I.; Balaji, N.V.; Kannimuthu, S. Weather Forecasting using Machine Learning Techniques. Test Eng. Manag. 2020, 83, 15264–15273. [Google Scholar] [CrossRef]
  48. Roy, D.S. Forecasting the Air Temperature at a Weather Station Using Deep Neural Networks. Procedia Comput. Sci. 2020, 178, 38–46. [Google Scholar] [CrossRef]
  49. Kabir, S.; Pender, G.; Patidar, S. Investigating capabilities of machine learning techniques in forecasting stream flow. In Institution of Civil Engineers-Water Management; Thomas Telford Ltd.: London, UK, 2020; Volume 173, pp. 69–86. [Google Scholar]
  50. Yuan, X.; Chen, C.; Lei, X.; Yuan, Y.; Muhammad Adnan, R. Monthly runoff forecasting based on LSTM–ALO model. Stoch. Environ. Res. Risk Assess. 2018, 32, 2199–2212. [Google Scholar] [CrossRef]
  51. Abbot, J.; Marohasy, J. Application of artificial neural networks to rainfall forecasting in Queensland, Australia. Adv. Atmos. Sci. 2012, 29, 717–730. [Google Scholar] [CrossRef]
  52. Mekanik, F.; Imteaz, M.A.; Gato-Trinidad, S.; Elmahdi, A. Multiple regression and Artificial Neural Network for long-term rainfall forecasting using large scale climate modes. J. Hydrol. 2013, 503, 11–21. [Google Scholar] [CrossRef]
  53. Bessafi, M.; Lasserre-Bigorry, A.; Neumann, C.J.; Pignolet-Tardan, F.; Payet, D.; Lee-Ching-Ken, M. Statistical prediction of tropical cyclone motion: An analog-CLIPER approach. Weather Forecast. 2002, 17, 821–831. [Google Scholar] [CrossRef]
  54. Hrnjica, B.; Bonacci, O. Lake Level Prediction using Feed Forward and Recurrent Neural Networks. Water Resour. Manag. 2019, 33, 2471–2484. [Google Scholar] [CrossRef]
  55. Liu, P.; Wang, J.; Sangaiah, A.; Xie, Y.; Yin, X. Analysis and Prediction of Water Quality Using LSTM Deep Neural Networks in IoT Environment. Sustainability 2019, 11, 2058. [Google Scholar] [CrossRef] [Green Version]
  56. Azad, A.; Pirayesh, J.; Farzin, S.; Malekani, L.; Moradinasab, S.; Kisi, O. Application of heuristic algorithms in improving performance of soft computing models for prediction of min, mean and max air temperatures. Eng. J. 2019, 23, 83–98. [Google Scholar] [CrossRef]
  57. Frnda, J.; Durica, M.; Nedoma, J.; Zabka, S.; Martinek, R.; Kostelansky, M. A weather forecast model accuracy analysis and ecmwf enhancement proposal by neural network. Sensors (Switzerland) 2019, 19, 5144. [Google Scholar] [CrossRef] [PubMed] [Green Version]
Figure 1. An illustration of a basic neuron.
Figure 2. The architecture of a recurrent neural network.
Figure 3. The structure of a long short-term memory cell.
Table 1. Description of selected studies.
Reference | Input | Data | Region | Type of Model | Configuration of Hidden Layer | Output
Ustaoglu et al. [20] | Past seven days of daily mean, maximum, and minimum air temperature | 1989–2003 | Turkey | ANN: (1) feed-forward back propagation (FFBP), (2) radial basis function (RBF), and (3) generalized regression neural network (GRNN) | FFBP (7,5,1); RBF (Tmean, Tmin: 7,0.99,1; Tmax: 7,0.55,1); GRNN (Tmean: 7,0.05,1; Tmax: 7,0.08,1; Tmin: 7,0.07,1) | Daily mean, maximum, and minimum temperature
Chattopadhyay et al. [19] | Previous maximum temperature values | 1901–2003 | India | Multilayer perceptron (MLP), generalized feed forward neural network (GFFNN), and modular neural network (MNN) | Number of hidden nodes determined by the number of adjustable parameters | Mean monthly maximum temperature
Abhishek et al. [33] | Historical 10-year data of a particular day | 1999–2009 | Canada | Feed-forward ANN | 5-hidden-layer network with 10 or 16 neurons/layer | Daily maximum temperature
Kumar et al. [34] | Six previous weekly mean temperatures | 2002–2011 | India | Feed-forward ANN | 2 hidden layers with 5 neurons/layer | One-week-ahead mean temperature
Tran et al. [32] | Past daily maximum temperature (7–36 lags) | 1976–2015 | South Korea | Traditional ANN, RNN, LSTM | Hidden nodes: 1–20; hidden layers: 1–3 | Daily maximum temperature for 1–15 days in advance
Tran and Lee [35] | Six previous daily maximum temperatures | 1976–2015 | South Korea | Traditional ANN | Hidden layers: 1, 3, 5; parameters: 49, 113, 169, 353, 1001 | One-day-ahead maximum temperature
Zhang et al. [23] | Four past temperature data map series | 1952–2018 | China | Convolutional recurrent neural network (CRNN) | 3 convolutional layers followed by an LSTM and a dense layer | Four future temperature data map series
Li et al. [36] | Historical time series of temperature (six past observations) | 2009–2018 | China | Stacked LSTM; DNN | Stacked LSTM: 3 LSTM layers (20, 10, and 4 hidden cells) + fully connected layer (4 neurons); DNN: three hidden layers with 12, 8, and 4 neurons | One-half-hour-ahead temperature
Afzali et al. [37] | Daily and monthly mean, minimum, and maximum ambient air temperature | 1961–2004 | Iran | Feed-forward ANN | One hidden layer with 15 neurons | Maximum, minimum, and mean ambient air temperature one day and one month ahead
De and Debnath [38] | December to May maximum and minimum temperature | 1901–2003 | India | Feed-forward NN | One hidden layer with 2 neurons | Maximum and minimum temperature of the monsoon months (June–August)
Smith et al. [22] | Up to prior 24 h: temperature, wind speed, rainfall, relative humidity, solar radiation, time-of-day, day-of-year | 2000–2005 | Georgia, USA | Ward-style ANN | Hidden layer: 3 parallel slabs; hidden nodes: 2–75 per slab | 1 h to 12 h air temperature
Smith et al. [39] | Up to prior 24 h: temperature, wind speed, rainfall, relative humidity, solar radiation, time-of-day, day-of-year, and the hourly rate of change of each weather variable in the last 24 h | 1997–2005 | Georgia, USA | Ward-style ANN | Hidden layer: 3 parallel slabs; hidden nodes: 40 per slab | 1 h to 12 h air temperature
Altan Dombayci and Gölcü [17] | Month, day, and mean temperature of the previous day | 2003–2006 | Turkey | ANN (Levenberg–Marquardt (LM) feed-forward backpropagation algorithm) | One hidden layer with 6 neurons | Daily mean ambient temperature
Bilgili and Sahin [28] | Latitude, longitude, altitude, and month | 1975–2006 | Turkey | MLP | One hidden layer with 32 neurons | Monthly temperature
Kisi and Shiri [40] | Latitude, longitude, altitude, and month | 1956–2010 | Iran | Feed-forward ANN | One hidden layer with 4 neurons | Monthly temperature
Kisi and Sanikhani [41] | Number of the month, station latitude, longitude, and altitude | 1986–2000 | Iran | Feed-forward ANN | One hidden layer | Monthly air temperature
Şahin [42] | City, month, altitude, latitude, longitude, monthly mean land surface temperatures | 1995–2005 | Turkey | Feed-forward ANN | One hidden layer with 14 neurons; one hidden layer with 24 neurons | Monthly mean air temperature
Salcedo-Sanz et al. [26] | Temperature in the previous month, Southern Oscillation Index (SOI), Indian Ocean Dipole (IOD), and Pacific Decadal Oscillation (PDO) | 1900–2010 for urban and 1910–2010 for rural stations in Australia; 1930–2010 for stations in New Zealand | Australia and New Zealand | MLP | NA | Monthly mean air temperature
Akram and El [43] | 24 (or 72) previous hourly temperature values | 2000–2015 | Morocco | LSTM | Two LSTM layers with a fully connected hidden layer (100 neurons) in between | 24 and 72 h air temperature
Jallal et al. [44] | 3 previous values of air temperature (t-1, t-2, t-3) and global solar radiation GSR (t, t-1, t-2, t-3, t-4) | 2014 | Morocco | MLP | 2 hidden layers with 5 and 8 neurons | Half-hourly air temperature (t)
Park et al. [45] | 24 h weather data: hourly wind speed, wind direction, and humidity | November 1981–December 2017 | South Korea | LSTM | 4 LSTM layers with 384 hidden nodes | Temperature up to 14 days in advance
Huang et al. [46] | 8–9 temperature factors from 50 CLIPER predictors | January 2015–June 2018 | 14 stations in Guangxi, China | Recurrent neural network (RNN) | One hidden layer with 10 neurons | 24 h daily maximum and minimum temperature
Sundaram et al. [47] | Air temperature, pressure, relative humidity, mean wind direction, total cloud cover, horizontal visibility, dew point temperature | 2006–2018 | India | MLP | Five hidden layers with 16, 32, 16, 5, and 1 neurons | Temperature for eight weeks
Roy [48] | Past seven days of average wind speed, precipitation, snowfall, snow depth, and average, maximum, and minimum temperature | 1 January 2009–1 January 2019 | John F. Kennedy International Airport, NY | MLP, LSTM, CNN+LSTM | MLP: 2 layers with 16 neurons per layer; LSTM: 16 hidden neurons followed by a dense layer; CNN+LSTM: one convolutional layer (32 filters, kernel size 5) followed by an LSTM cell with 16 neurons and a dense layer | Average temperature for the next day and 10 days ahead
Kreuzer et al. [24] | Air temperature, relative humidity, relative air pressure, sea-level air pressure, cloudiness, wind speed, wind direction, precipitation, month, hour of day | 2009–2018 | Germany | Univariate LSTM, multivariate LSTM, ConvLSTM | Univariate and multivariate LSTM: 1 hidden layer with 32 hidden neurons; ConvLSTM: 6 convolutional layers + 2 LSTM + 2 dense layers | 24 h air temperature
Lee et al. [25] | Air temperature, precipitation, humidity, vapor pressure, dew point temperature, air pressure, sea-level pressure, hours of sunshine, solar radiation, total cloud cover, middle- and low-level cloud cover, ground surface temperature, wind speed and direction | 2009–2018 | South Korea | MLP, LSTM, CNN | MLP: 6 hidden layers; LSTM (daily input): 2 hidden LSTM + 3 dense layers; LSTM (hourly input): 2 hidden LSTM + 6 dense layers; CNN: 5 convolutional layers + 2 dense layers | Daily average, minimum, and maximum temperature