Putting satellite data to the test with BayWa r.e.’s solar experts

In an ever more data-hungry world, how do we ensure the data we use is of the highest possible quality? How can we know that our clients and in-house teams are making decisions based on the right information?

The simple answer: we can test the data ourselves. But that’s no straightforward matter. Here, we explain how we invested in benchmarking satellite data from three competing providers to determine which service best met the needs of our clients.

Setting our objectives

Satellite-based solar irradiation data are widely used in the renewable energy industry. Over the last decade, the photovoltaic industry has adopted satellite measurements for various purposes, including but not limited to:

  • identifying new solar sites,
  • estimating annual energy yield during the development phase of a project,
  • identifying on-site measurement bias or degradation during the operational phase of a solar site,
  • backfilling missing on-site measurements,
  • providing a consistent, reliable measurement for calculating KPIs for operational sites.

At BayWa r.e. we have recognised the importance of satellite-based irradiation data, and we often use it as an additional data source for all our operational sites.

One of the most important use cases for satellite-based irradiance data at BayWa r.e. is backfilling on-site measurements when readings from pyranometers (which measure irradiance at the Earth’s surface) are missing or inaccurate.
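
As a rough illustration of what backfilling can look like, the sketch below aligns a satellite-derived irradiance series with the on-site pyranometer series and fills the gaps. The series names, the 10-minute resolution and the use of pandas are illustrative assumptions, not a description of our production pipeline.

```python
import pandas as pd

def backfill_irradiance(onsite: pd.Series, satellite: pd.Series) -> pd.Series:
    """Fill gaps in on-site pyranometer readings with satellite-derived values.

    Both inputs are assumed to be irradiance in W/m² on a datetime index.
    """
    # Bring the satellite series onto the on-site timestamps
    # (10-minute resolution is an illustrative assumption)
    satellite_aligned = satellite.resample("10min").mean().reindex(onsite.index)
    # Keep the on-site reading wherever it exists; fall back to satellite otherwise
    return onsite.combine_first(satellite_aligned)
```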

For these reasons, we identified the need to benchmark different services using data from the sites we manage, with a focus on how well each service reproduces site conditions as recorded by on-site measurements. In this way, we set targets for a case study tailored to the peculiarities of the assets we manage and to our use cases.

BayWa r.e. carefully selected the service providers by approaching the most recognised names in the field, with strong track records and well-proven services.

On this basis, we identified three service providers and benchmarked them against eight different instruments across four solar sites.

Defining a methodology

Our analysis proceeded in three steps:

1. Data collection: Satellite and pyranometer data are retrieved from the providers’ APIs

2. Data preparation: Pre-processing and filtering steps such as unit conversion, resampling, and outlier removal.

3. Analysis: The providers were compared in two regression sub-analyses, using two regression models (a minimal sketch of steps 2 and 3 follows this list):

  • Ordinary least squares (OLS): Estimates the relationship between variables with a straight line, helping us understand how one variable changes with another
  • Linear mixed effects (LME): Accounts for both fixed and random effects, capturing variations and correlations. Ideal for complex data with different sources of variability
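
To make steps 2 and 3 concrete, here is a minimal sketch using pandas and statsmodels. The column names (ghi_pyranometer, ghi_satellite, site), the hourly resampling, the plausibility thresholds and the random intercept per site are illustrative assumptions rather than the exact set-up of our study.

```python
import pandas as pd
import statsmodels.formula.api as smf

def prepare(df: pd.DataFrame) -> pd.DataFrame:
    """Step 2 (sketch): resample to hourly means per site and drop implausible readings."""
    hourly = (df.groupby("site")[["ghi_pyranometer", "ghi_satellite"]]
                .resample("1h")
                .mean()
                .reset_index())
    # Plausibility filter with illustrative thresholds (W/m²), not the study's actual filters
    mask = (hourly["ghi_pyranometer"].between(10, 1500)
            & hourly["ghi_satellite"].between(10, 1500))
    return hourly[mask].dropna()

def fit_models(df: pd.DataFrame):
    """Step 3 (sketch): fit OLS and LME models relating on-site to satellite irradiance."""
    # Ordinary least squares: a single straight line across all observations
    ols_fit = smf.ols("ghi_pyranometer ~ ghi_satellite", data=df).fit()
    # Linear mixed effects: same fixed slope, plus a random intercept per site
    lme_fit = smf.mixedlm("ghi_pyranometer ~ ghi_satellite", data=df,
                          groups=df["site"]).fit()
    return ols_fit, lme_fit
```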

For our performance indicators, we used:

  • Root mean squared error (RMSE): The average difference between predicted and observed values; a lower value is more desirable.
  • R²: How well the model explains the variation in the data; here we want a higher score (a small helper computing both follows this list).
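
For reference, both indicators can be computed directly from a model’s predictions; the small helper below, using NumPy and scikit-learn purely for illustration, mirrors these definitions.

```python
import numpy as np
from sklearn.metrics import mean_squared_error, r2_score

def score(observed: np.ndarray, predicted: np.ndarray) -> dict:
    """Return RMSE (same units as the irradiance data, lower is better)
    and R² (unitless, higher is better)."""
    rmse = float(np.sqrt(mean_squared_error(observed, predicted)))
    r2 = float(r2_score(observed, predicted))
    return {"RMSE": rmse, "R2": r2}
```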

First-stage analysis

Initial results showed poor performance from Satellite 3, with high RMSE and low R² values.

Based on these results, we ruled it out as a strong contender.

Meanwhile, Satellite 2 also showed slightly higher RMSE values than Satellite 1. Unlike with Satellite 3, however, the difference was not statistically significant, so we introduced an additional stage in the analysis to compare Satellites 1 and 2 directly.
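
One way to check whether such a difference is statistically significant, shown here as an illustrative assumption rather than our exact test, is a paired t-test on the per-instrument RMSE values of the two providers (eight instruments in our case).

```python
from typing import Sequence
from scipy.stats import ttest_rel

def rmse_difference_significant(rmse_a: Sequence[float],
                                rmse_b: Sequence[float],
                                alpha: float = 0.05) -> bool:
    """Paired t-test on per-instrument RMSE values from two providers.

    Returns True if the mean RMSE difference is statistically significant
    at the chosen level. The inputs would come from the first-stage results.
    """
    _, p_value = ttest_rel(rmse_a, rmse_b)
    return p_value < alpha
```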

Second-stage analysis

The RMSE values from both the OLS and LME models show that Satellite 1 consistently has the lowest, most desirable RMSE.

Conclusions

Satellite benchmarking was performed to select the satellite source that best predicts pyranometer (site) irradiation data. For this purpose, three different satellite sources and four different solar sites were used. Both OLS and mixed-effects models were used in the analysis, and RMSE and R² values were used to compare the performance of the models.

Based on the methodology described above, we were able to identify a single provider of satellite-based irradiation data that best describes the conditions at the solar sites BayWa r.e. manages.
