Difference in MB Soil Moisture readings compared to Davis Envoy - missing temperature corrections?
Posted: Sun Jul 12, 2020 4:58 pm
Since March this year I have had an MB Pro Red running in parallel with a Davis Envoy system. Before switching over to the MB readings only, I wanted to check for differences between the two systems; both use the same wireless signals from the ISS and the Soil/Leaf station.
Most deltas, if any, are small and usually due to slight differences in timing, rounding, etc., but after 3 months of recording, the MB Soil Moisture readings turned out to be up to 40% higher than the Davis Envoy readings, see chart below.
The plot shows some systematic shifts in the data, so I plotted the ratio between the MB and Davis readings over time.
The delta becomes smaller as summer nears. When adding Soil Temperature to the time plot, an inverse relationship with the MB/Davis ratio becomes visible. That reminded me of a note in the Davis manual saying that soil temperature is used to correct the soil moisture readings: "If no soil temperature sensors are installed, the station will use the default temperature of 75ºF (24ºC) to compensate the soil moisture readings. The soil moisture accuracy will be off by about 1% for each 1ºF (0.5ºC) difference between the actual soil temperature and 75ºF (24ºC)."
In this case the soil temperature in March was down to 6 ºC, i.e. 18 ºC = 32.4 ºF below the Davis reference temperature, so a correction of about 32.4% would be applied. The expected MB/Davis ratio is then 100/(100 - 32.4) ≈ 1.48, which is rather close to the up-to-40% difference noted in the graphs above between the MB and Davis readings of the same raw wireless signal.
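To make the arithmetic explicit, here is a minimal Python sketch of how I read that manual note. The linear 1% per ºF rule and the 75 ºF (24 ºC) reference come from the quote above; the function name and the assumption that Meteobridge applies no compensation at all are mine, for illustration only:

```python
# Davis manual (quoted above): without a soil temperature sensor the
# station assumes 75 degF, and the soil moisture reading is off by about
# 1% per 1 degF that the actual soil temperature deviates from 75 degF.
# The linear model below is my reading of that note, not Davis' actual
# firmware algorithm.

DAVIS_REF_C = 24.0  # Davis reference soil temperature (75 degF in degC)

def expected_mb_davis_ratio(soil_temp_c: float) -> float:
    """Expected MB/Davis ratio if Davis subtracts ~1% per degF below the
    reference and Meteobridge applies no compensation (my hypothesis)."""
    delta_f = (DAVIS_REF_C - soil_temp_c) * 9.0 / 5.0  # degF below 75 degF
    correction_pct = delta_f * 1.0                     # ~1% per degF
    return 100.0 / (100.0 - correction_pct)

# March example from above: soil temperature down to 6 degC
print(expected_mb_davis_ratio(6.0))  # 18 degC = 32.4 degF below -> ~1.48
```

Read this way, the ratio should drift back toward 1.0 as the soil warms toward 24 ºC, which matches the shrinking delta I see as summer nears.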
Is the Meteobridge missing such a correction, or am I overlooking something?