BOM Versus Actual Weather Accuracy Assessment

We’ve built a streamlined solution to evaluate how closely real-world weather data aligns with BOM forecasts. The system retrieves key metrics (such as temperature, humidity, and wind speed) from the BOM database, calculates daily or monthly variations, and translates them into easy-to-read accuracy percentages. With colour-coded differences, season-by-season performance, and long-term trend tracking, our tool offers a clear picture of weather forecast reliability at a glance.

BOM vs Dayboro.AU Forecast Accuracy Comparison

Month           Temperature   High Temp   Low Temp   Wind Speed   Max Gust   Precipitation   UV Index
March 2025      97%           81%         76%        0%           100%       0%              52%
February 2025   90%           65%         45%        57%          100%       0%              71%

Statistical Summary

Overall Performance

Best Month: February 2025 (71% accuracy)
Worst Month: March 2025 (68% accuracy)

Seasonal Accuracy

Autumn: 68%
Summer: 71%

Best Performance by Metric

Temperature: March 2025 (97%)
High Temperature: March 2025 (81%)
Low Temperature: March 2025 (76%)
Wind Speed: February 2025 (57%)
Maximum Gust: March 2025 (100%)
UV Index: February 2025 (71%)

How do we calculate?

For transparency, below are the rules we use for the calculations. We use the same calculations to validate the weather predictions made by the BOM and compare them against the data recorded here at Dayboro.

We compare our forecasts against those created on-site in Dayboro, AU. See the post Dayboro.au forecasts versus reality. 

How Weather Accuracy Statistics Are Calculated

Here’s an explanation of how the numbers and statistics in the Weather Accuracy System are calculated, including the formulas used.

Basic Difference Calculation

The foundation of the accuracy assessment is calculating the difference between forecast values and actual measurements.

For example, with temperature:

Temperature_difference = Actual_temperature - Forecast_temperature

This is done for multiple weather metrics:

    • High temperature
    • Low temperature
    • Average temperature (derived from high and low)
    • Wind speed
    • Maximum gust
    • Precipitation
    • Relative humidity (when available)
    • UV index (when available)
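
As a minimal sketch of this step (the metric keys and sample values below are illustrative only, not taken from the live BOM feed), the per-metric differences can be computed in a single pass:

// Illustrative only: metric keys and values are made up for this example.
$forecast = ['high_temp' => 29.0, 'low_temp' => 17.5, 'wind_speed' => 15.0, 'precipitation' => 2.0];
$actual   = ['high_temp' => 30.5, 'low_temp' => 16.0, 'wind_speed' => 22.0, 'precipitation' => 0.0];

$differences = [];
foreach ($actual as $metric => $value) {
    // Difference = actual minus forecast, as in the formula above.
    $differences[$metric] = $value - $forecast[$metric];
}
// $differences => ['high_temp' => 1.5, 'low_temp' => -1.5, 'wind_speed' => 7.0, 'precipitation' => -2.0]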
 

Accuracy Percentage Calculation

Raw differences are converted to accuracy percentages using this formula:

Accuracy_percentage = (1 - (|difference| / max_tolerable)) × 100%

Where:

    • |difference| is the absolute value of the difference between the actual and forecast values
    • max_tolerable is set to 5.0 for all metrics, representing the maximum acceptable difference
    • If difference exceeds max_tolerable, accuracy is set to 0%

In code, this is implemented as:

// Convert a forecast/actual difference into an accuracy percentage.
function weather_accuracy_difference_to_accuracy($diff) {
    // Only the magnitude of the difference matters.
    $abs_diff = abs(floatval($diff));
    // Differences of 5.0 or more count as a complete miss.
    $max_tolerable = 5.0;
    if ($abs_diff >= $max_tolerable) {
        return 0;
    }
    // Linear scale: a difference of 0 gives 100%, a difference of 5.0 gives 0%.
    return round((1 - ($abs_diff / $max_tolerable)) * 100);
}

This creates a linear scale where:


    • Perfect prediction (difference = 0) yields 100% accuracy
    • Small differences (under 1) yield high accuracy (>80%)
    • Moderate differences (2-4) yield medium accuracy (20-60%)
    • Large differences (≥5) yield 0% accuracy
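
To make the scale concrete, here are a few sample calls to the function above (results shown in the comments):

echo weather_accuracy_difference_to_accuracy(0);     // 100
echo weather_accuracy_difference_to_accuracy(1.0);   // 80
echo weather_accuracy_difference_to_accuracy(-2.5);  // 50
echo weather_accuracy_difference_to_accuracy(6.2);   // 0 (beyond the maximum tolerable difference)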
 

Daily to Monthly Aggregation

For monthly views, daily differences are averaged:

Monthly_average_difference = Sum_of_daily_differences / Number_of_days

Monthly_accuracy = weather_accuracy_difference_to_accuracy(Monthly_average_difference)
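
A minimal sketch of that aggregation, assuming the daily differences for the month are already collected in a plain array (the variable names are ours, not from the production code):

// Illustrative daily temperature differences (actual - forecast) for one month.
$daily_differences = [1.2, -0.5, 2.0, 0.0, -1.8, 0.7];

// Average the daily differences, then convert the average into an accuracy score.
$monthly_average_difference = array_sum($daily_differences) / count($daily_differences);
$monthly_accuracy = weather_accuracy_difference_to_accuracy($monthly_average_difference); // 95 for this sample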

Overall Accuracy Calculation

The “Overall Accuracy” for a month combines all available metrics:

Overall_accuracy = (Temperature_acc + Hi_Temp_acc + Low_Temp_acc + Wind_Spd_acc + TotPrcp_acc + Max_Gust_acc + RelHum_acc + UV_Index_acc) / Number_of_metrics
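
Because relative humidity and the UV index are only included when available, a sketch of this step can simply average whatever per-metric accuracies exist for the month (the array and values below are illustrative):

// Illustrative per-metric accuracy percentages for one month; null = metric not recorded.
$metric_accuracies = [
    'temperature'   => 95,
    'high_temp'     => 80,
    'low_temp'      => 75,
    'wind_speed'    => 60,
    'max_gust'      => 100,
    'precipitation' => 40,
    'rel_humidity'  => null,
    'uv_index'      => 70,
];

// Keep only the metrics that were actually available, then average them.
$available = array_filter($metric_accuracies, fn($acc) => $acc !== null);
$overall_accuracy = round(array_sum($available) / count($available)); // 74 for this sample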

Seasonal Averages

Seasonal averages group months by season and average their overall accuracy scores:

Season_accuracy = Sum_of_monthly_overall_accuracies / Number_of_months

For the Southern Hemisphere:

    • Summer: December, January, February
    • Autumn: March, April, May
    • Winter: June, July, August
    • Spring: September, October, November
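
A sketch of that grouping using the Southern Hemisphere mapping above (the helper function name is ours, not from the production code):

// Map a month number (1-12) to its Southern Hemisphere season.
function month_to_season($month) {
    if (in_array($month, [12, 1, 2])) return 'Summer';
    if (in_array($month, [3, 4, 5]))  return 'Autumn';
    if (in_array($month, [6, 7, 8]))  return 'Winter';
    return 'Spring'; // September, October, November
}

// Group monthly overall accuracies by season, then average each group.
$monthly_overall = [2 => 71, 3 => 68]; // month number => overall accuracy (%)
$by_season = [];
foreach ($monthly_overall as $month => $accuracy) {
    $by_season[month_to_season($month)][] = $accuracy;
}
$season_accuracy = array_map(fn($scores) => round(array_sum($scores) / count($scores)), $by_season);
// $season_accuracy => ['Summer' => 71, 'Autumn' => 68]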
 

Year-over-Year Trends

These calculate how accuracy has changed between years:

Year_trend = Current_year_accuracy - Previous_year_accuracy

A positive trend shows improvement, while a negative trend indicates declining accuracy.
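
For example, if this year’s overall accuracy is 74% and last year’s was 68%, the year trend is +6 percentage points, meaning the forecasts have become more reliable.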

Colour Coding

Difference values are colour-coded for visual assessment:

    • Green: |difference| ≤ 2 (small differences)
    • Orange: 2 < |difference| ≤ 4 (moderate differences)
    • Red: |difference| > 4 (large differences)
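
A minimal sketch of that banding (the function name is illustrative, not from the production code):

// Return the display colour band for a forecast/actual difference.
function weather_accuracy_difference_colour($diff) {
    $abs_diff = abs(floatval($diff));
    if ($abs_diff <= 2) return 'green';   // small difference
    if ($abs_diff <= 4) return 'orange';  // moderate difference
    return 'red';                         // large difference
}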
 

Best/Worst Performance

These statistics identify the months with the highest and lowest overall accuracy scores, helping to identify patterns in forecast performance.
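
As an illustration, using the monthly overall accuracies from the statistical summary above, the best and worst months fall out of a simple maximum and minimum:

// Monthly overall accuracy scores from the statistical summary above.
$monthly_scores = ['February 2025' => 71, 'March 2025' => 68];

// The best and worst months are simply the highest and lowest overall scores.
$best_month  = array_search(max($monthly_scores), $monthly_scores);  // "February 2025"
$worst_month = array_search(min($monthly_scores), $monthly_scores);  // "March 2025"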
