Article Type: Research Article
Authors
Department of Water Engineering and Management, Tarbiat Modares University, Tehran, Iran
Abstract
Keywords
Subjects
Article Title [English]
Authors [English]
Groundwater level forecasting is a crucial component of water resource management and planning. Reliable forecasts can improve water resource management, especially in regions facing water crises. In recent years, the use of artificial intelligence for forecasting groundwater levels has gained significant attention. These models can capture complex, nonlinear relationships in the data and are widely used in areas where accurate and comprehensive hydrological data are not available. In this study, the Long Short-Term Memory (LSTM) model was used to forecast groundwater levels in the Saadat Abad area of the Tashk-Bakhtegan Basin in Fars Province. The main objectives were to evaluate the performance of the LSTM model against traditional models and to analyze the impact of different activation functions on forecasting accuracy. Bayesian optimization was employed to tune the model's hyperparameters, which significantly improved forecasting accuracy and the simulation of long-term dependencies in the input data. The results showed that the LSTM model can forecast groundwater level fluctuations and long-term trends with high accuracy. A comparison of activation functions revealed that ReLU best simulated groundwater level changes, with an NSE of 0.99, an R² of 0.97, and an RMSE of 0.67 m. Furthermore, using a GPU substantially reduced processing time: execution took 31 minutes on CPU but only 9 minutes on GPU. The model demonstrated a strong ability to simulate complex temporal patterns and accurately forecast groundwater levels, making it an efficient tool for groundwater resource management in data-scarce regions.
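The long-term dependency modeling the abstract attributes to the LSTM comes from its gated cell state, which carries information across many time steps. The sketch below is not the authors' code; it is a minimal NumPy illustration of a single LSTM cell unrolled over a synthetic monthly series, with assumed layer sizes and randomly initialized weights, showing how the forget and input gates update the cell state that a dense output layer would then map to a level forecast.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM forward step.
    x: input (n_in,); h_prev, c_prev: previous hidden/cell states (n_hid,)
    W: (4*n_hid, n_in), U: (4*n_hid, n_hid), b: (4*n_hid,)
    Gate order in the stacked weights: input, forget, output, candidate.
    """
    n_hid = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0 * n_hid:1 * n_hid])   # input gate: how much new info enters
    f = sigmoid(z[1 * n_hid:2 * n_hid])   # forget gate: how much memory is kept
    o = sigmoid(z[2 * n_hid:3 * n_hid])   # output gate
    g = np.tanh(z[3 * n_hid:4 * n_hid])   # candidate cell update
    c = f * c_prev + i * g                # cell state carries long-term trends
    h = o * np.tanh(c)                    # hidden state exposed to the next layer
    return h, c

# Hypothetical sizes: 3 monthly inputs (e.g. rainfall, abstraction, past level)
rng = np.random.default_rng(0)
n_in, n_hid = 3, 8
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in))
U = rng.normal(scale=0.1, size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h, c = np.zeros(n_hid), np.zeros(n_hid)
series = rng.normal(size=(12, n_in))      # 12 synthetic monthly observations
for x in series:
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)  # final hidden state; a dense layer would map it to a forecast
```

In practice the study's setup (a framework LSTM layer, ReLU activations, Bayesian hyperparameter search, GPU execution) would be built with a deep learning library rather than hand-rolled NumPy, but the gate arithmetic above is the mechanism those layers implement.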
Keywords [English]