Probabilistic prediction of photovoltaic power using multi-task learning and large language models

Researchers combine wavelet transforms with Meta's LLaMA-based large language models for next-generation probabilistic forecasting of solar power output, using multi-task learning to improve accuracy.

Solar power generation is inherently variable, making accurate forecasting crucial for integrating photovoltaic energy into the grid. A new study presents an advanced approach to predicting photovoltaic power output that combines signal processing with deep learning. The researchers first use wavelet transforms to decompose the complex time series produced by photovoltaic systems into simpler components. This preprocessing step allows the subsequent models to focus on the key temporal features that drive power generation.
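The summary does not describe the study's exact wavelet configuration, so the following is only a minimal sketch of this style of preprocessing, assuming the PyWavelets library with an illustrative 'db4' wavelet and three decomposition levels:

```python
# Hypothetical sketch of wavelet preprocessing for a PV power time series.
# The wavelet family ('db4') and decomposition level (3) are illustrative
# assumptions, not values taken from the paper.
import numpy as np
import pywt

def decompose_pv_series(power: np.ndarray, wavelet: str = "db4", level: int = 3):
    """Split a PV power series into one approximation and several detail components."""
    coeffs = pywt.wavedec(power, wavelet, level=level)
    components = []
    for i in range(len(coeffs)):
        # Keep only the i-th coefficient band, zero the rest, and reconstruct
        # it in the time domain so each component matches the input length.
        masked = [cj if j == i else np.zeros_like(cj) for j, cj in enumerate(coeffs)]
        components.append(pywt.waverec(masked, wavelet)[: len(power)])
    return components  # [approximation, detail_level, ..., detail_1]

# Example: a noisy daily generation profile sampled every 15 minutes.
t = np.linspace(0, 1, 96)
pv = np.clip(np.sin(np.pi * t), 0, None) + 0.05 * np.random.randn(96)
bands = decompose_pv_series(pv)
```

Each reconstructed band can then be fed to the downstream model either separately or as stacked input channels.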

At the core of the proposed method, the team employs a large language model (LLM) derived from Meta's LLaMA architecture, adapted for multi-task learning. In this multi-task framework, the model learns from several related prediction targets simultaneously, improving its ability to generalize across different conditions and data sets. The LLM-based networks are shown to exploit shared structure in the data, yielding significant gains in probabilistic forecasting accuracy over traditional or single-task models.
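The summary does not detail how the LLaMA backbone is adapted, so the sketch below substitutes a small generic Transformer encoder as the shared backbone, with one lightweight head per forecasting task; the class and parameter names are hypothetical:

```python
# Minimal PyTorch sketch of the multi-task idea: several forecasting tasks
# (e.g. different horizons or sites) share one sequence backbone and keep
# task-specific output heads. A small Transformer encoder stands in for the
# adapted LLaMA model, whose exact adaptation the summary does not specify.
import torch
import torch.nn as nn

class MultiTaskForecaster(nn.Module):
    def __init__(self, d_model=64, n_tasks=3, horizon=24, n_quantiles=3):
        super().__init__()
        self.embed = nn.Linear(1, d_model)  # project scalar PV values to d_model
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=2)  # shared across tasks
        # One lightweight head per task; each predicts `horizon` steps
        # at `n_quantiles` quantile levels (probabilistic output).
        self.heads = nn.ModuleList(
            nn.Linear(d_model, horizon * n_quantiles) for _ in range(n_tasks)
        )
        self.horizon, self.n_quantiles = horizon, n_quantiles

    def forward(self, x, task_id: int):
        # x: (batch, seq_len, 1) wavelet-preprocessed PV history
        h = self.backbone(self.embed(x))   # (batch, seq_len, d_model)
        pooled = h[:, -1]                  # last time step as sequence summary
        out = self.heads[task_id](pooled)
        return out.view(-1, self.horizon, self.n_quantiles)

model = MultiTaskForecaster()
preds = model(torch.randn(8, 96, 1), task_id=0)  # (8, 24, 3)
```

Because the backbone parameters are shared, gradient updates from every task shape the same representation, which is the mechanism that lets related prediction targets reinforce one another.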

The results indicate that this hybrid artificial intelligence system better captures the stochastic nature of solar energy production. By producing probabilistic forecasts, the approach gives operators and planners in the energy sector more informative uncertainty estimates. The combination of wavelet-based signal decomposition and a Meta AI LLaMA-based model points to a promising direction for next-generation renewable energy forecasting, helping utilities balance supply and demand while integrating larger shares of solar power into the energy mix.
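One common way to obtain such probabilistic forecasts is to train quantile outputs with the pinball (quantile) loss; whether the authors use this exact formulation is not stated, so the following snippet is only an illustrative assumption:

```python
# Pinball (quantile) loss for probabilistic forecasts: each output quantile
# is penalised asymmetrically, so the trained model emits calibrated
# prediction intervals rather than a single point estimate.
import torch

def pinball_loss(pred, target, quantiles=(0.1, 0.5, 0.9)):
    """pred: (batch, horizon, n_quantiles); target: (batch, horizon)."""
    q = torch.tensor(quantiles, device=pred.device).view(1, 1, -1)
    err = target.unsqueeze(-1) - pred  # positive when the model under-forecasts
    return torch.maximum(q * err, (q - 1) * err).mean()

loss = pinball_loss(torch.randn(8, 24, 3), torch.rand(8, 24))  # scalar training loss
```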

Impact Score: 73

JEDEC outlines LPDDR6 expansion for data centers

JEDEC has previewed planned updates to LPDDR6 aimed at pushing the memory standard beyond mobile devices and into selected data center and accelerated computing use cases. The roadmap includes higher-capacity packaging options, flexible metadata support, 512 GB densities, and a new SOCAMM2 module standard.

TSMC debuts A13 process technology

TSMC has introduced its A13 process at its 2026 North America Technology Symposium as a tighter version of A14 aimed at next-generation artificial intelligence, high-performance computing, and mobile designs. The company positions the node as a more compact and efficient option with backward-compatible design rules for faster migration.

Google unveils eighth-generation tensor processing units

Google introduced its eighth generation of custom tensor processing units with separate designs for training and inference. The new TPU 8t and TPU 8i are aimed at large-scale model training, serving, and agentic workloads.
