The degradation of solar cells is a critical factor affecting the efficiency and lifespan of photovoltaic (PV) systems, and detecting and diagnosing this degradation early can significantly improve the maintenance and operation of solar farms. This study proposes a novel deep-learning approach to predicting solar cell degradation from thermal images captured by unmanned aerial vehicles (UAVs). A hybrid deep learning model combining Inception modules, Bidirectional LSTM layers, and Multi-Head Attention mechanisms is developed to process the thermal imaging data and classify solar cells into two categories: “defective” and “non-defective”. The dataset consists of thermal images of solar panels, preprocessed to standardize image size and enhance contrast for improved model performance. The proposed model achieved an accuracy of 89.65%, a precision of 93.67%, and a recall of 85.05%, together with an F1 score of 89.15% and an Area Under the ROC Curve (AUC) of 0.9613, indicating its effectiveness in distinguishing degraded panels from non-degraded ones. These results show that combining advanced neural network architectures can significantly enhance the accuracy of solar cell degradation detection, and they highlight the model’s potential for automated, large-scale solar farm inspections as a reliable and efficient tool for predictive maintenance in the solar energy industry.
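The hybrid architecture described above (Inception-style convolutional feature extraction, followed by a Bidirectional LSTM over the spatial feature sequence and a Multi-Head Attention layer before binary classification) can be illustrated with a minimal PyTorch sketch. This is an assumed arrangement for illustration only, not the authors' exact IABiLSTM-Net implementation; all class names, layer widths, and the 64×64 single-channel input size are hypothetical choices.

```python
import torch
import torch.nn as nn

class InceptionBlock(nn.Module):
    """Parallel 1x1 / 3x3 / 5x5 conv branches plus a pooled branch, concatenated."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        b = out_ch // 4
        self.b1 = nn.Conv2d(in_ch, b, 1)
        self.b3 = nn.Sequential(nn.Conv2d(in_ch, b, 1), nn.Conv2d(b, b, 3, padding=1))
        self.b5 = nn.Sequential(nn.Conv2d(in_ch, b, 1), nn.Conv2d(b, b, 5, padding=2))
        self.bp = nn.Sequential(nn.MaxPool2d(3, stride=1, padding=1), nn.Conv2d(in_ch, b, 1))

    def forward(self, x):
        return torch.relu(torch.cat([self.b1(x), self.b3(x), self.b5(x), self.bp(x)], dim=1))

class HybridDegradationNet(nn.Module):
    """Inception features -> row-wise sequence -> BiLSTM -> multi-head attention -> 2-class logits."""
    def __init__(self, hidden=64, heads=4):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU())
        self.inc = InceptionBlock(16, 32)
        self.pool = nn.AdaptiveAvgPool2d((8, 8))           # fixed 8x8 spatial grid
        self.lstm = nn.LSTM(32 * 8, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        self.head = nn.Linear(2 * hidden, 2)               # "defective" / "non-defective"

    def forward(self, x):
        f = self.pool(self.inc(self.stem(x)))              # (B, 32, 8, 8)
        seq = f.permute(0, 2, 1, 3).flatten(2)             # rows as timesteps: (B, 8, 256)
        h, _ = self.lstm(seq)                              # (B, 8, 2*hidden)
        a, _ = self.attn(h, h, h)                          # self-attention over the sequence
        return self.head(a.mean(dim=1))                    # (B, 2) class logits

model = HybridDegradationNet()
logits = model(torch.randn(2, 1, 64, 64))                  # batch of 2 thermal images
```

The key design point is the hand-off between blocks: the 2-D feature map is reinterpreted as a sequence of rows so that the BiLSTM can model spatial context in both directions, and the attention layer then reweights that sequence before pooling to a single panel-level prediction.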
The proposed IABiLSTM-Net model demonstrates strong potential for practical deployment in smart energy systems. By automating solar panel degradation detection, the system offers a low-cost, scalable, and accurate solution for long-term solar farm maintenance and energy efficiency improvement.
Although effective, the model was trained on a relatively small dataset and evaluated under a single set of environmental conditions. Variability in thermal image quality caused by weather or camera resolution can affect the model's generalizability, and real-world deployment may also require hardware integration and calibration.
Future improvements will focus on expanding the dataset with images captured under varied weather conditions, incorporating multispectral data, and extending the classification to cover multiple degrees of degradation. Integration with drone-based real-time inspection systems is also a promising direction.