Symmetry, vol. 17, no. 7, 2025 (SCI-Expanded)
As a classical statistical method, multiple regression is widely used for forecasting tasks in power, medicine, finance, and other fields. The rise of machine learning has led to the adoption of neural networks, particularly Long Short-Term Memory (LSTM) models, for complex forecasting problems, owing to their strong ability to capture temporal dependencies in sequential data. Nevertheless, the performance of LSTM models is highly sensitive to hyperparameter configuration, and traditional manual tuning is inefficient, relies heavily on expert experience, and generalizes poorly. To address the challenges of complex hyperparameter spaces and the limitations of manual adjustment, an improved sparrow search algorithm (ISSA) with adaptive parameter configuration was developed for LSTM-based multivariate regression, in which the number of hidden units, the learning rate, and the number of training iterations are optimized systematically to improve model generalization. The SSA is improved in four ways. First, the population is initialized with a reverse (opposition-based) learning strategy to increase population diversity. Second, the position update mechanism of the producer sparrows is improved: different update formulas are selected according to the value of a random number, which avoids convergence toward the origin and improves search flexibility. Third, the step factor is adjusted dynamically to improve solution accuracy. Fourth, to strengthen global search and help the algorithm escape local optima, Lévy flight is integrated into the position update of the scout (early-warning) sparrows. Experimental evaluations on benchmark functions from the CEC2005 test set demonstrated that the ISSA outperforms particle swarm optimization (PSO), the standard SSA, and other algorithms in optimization performance.
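The abstract names the ingredients but not their exact formulas. As a minimal illustrative sketch, and assuming a standard formulation of reverse (opposition-based) initialization and Lévy-flight steps via Mantegna's algorithm (the function names, the sphere test function, and all parameter values here are assumptions, not the paper's implementation), two of these components might look like:

```python
import numpy as np
from math import gamma

def sphere(x):
    """Simple benchmark fitness (minimization), stand-in for a CEC2005 function."""
    return float(np.sum(x ** 2))

def opposition_init(fitness, pop_size, dim, lb, ub, rng):
    """Reverse (opposition-based) learning initialization: draw a random
    population, form its opposite lb + ub - x, and keep the fittest half
    of the merged pool to increase initial diversity and quality."""
    base = rng.uniform(lb, ub, size=(pop_size, dim))
    opposite = lb + ub - base                      # reverse-learning counterparts
    merged = np.vstack([base, opposite])
    scores = np.array([fitness(x) for x in merged])
    keep = np.argsort(scores)[:pop_size]           # best pop_size individuals
    return merged[keep]

def levy_step(dim, rng, beta=1.5):
    """Lévy-distributed step via Mantegna's algorithm: heavy-tailed jumps
    that occasionally relocate a scout sparrow far away, helping the
    search escape local optima."""
    sigma = (gamma(1 + beta) * np.sin(np.pi * beta / 2)
             / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

rng = np.random.default_rng(0)
pop = opposition_init(sphere, pop_size=20, dim=5, lb=-10.0, ub=10.0, rng=rng)
# A scout sparrow perturbed by a Lévy flight around the current best:
scout = pop[0] + levy_step(5, rng)
```

The opposition step costs one extra fitness evaluation per individual at startup but typically starts the swarm closer to promising regions; the Lévy perturbation replaces a purely Gaussian move only for the warning/scout role, so exploitation elsewhere is unaffected.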
Further validation on power load and real estate datasets showed that the ISSA-LSTM model yields superior prediction accuracy compared with existing approaches, with an RMSE of 83.102 and an R² of 0.550 for electric load forecasting and an RMSE of 18.822 and an R² of 0.522 for real estate price prediction. Future research will explore integrating the ISSA with alternative neural architectures such as GRUs and Transformers to assess its flexibility and effectiveness across different sequence modeling paradigms.