Inconsistency between DW1000 user manual and driver on how to get temperature

I’m trying to get the temperature from the DW1000, and I’ve noticed an inconsistency between the documentation provided for the function dwt_readtempvbat in the “DW1000 Application Programming Interface v 2.14”, found here, and the DW1000 User Manual, found here, on page 159 in the section titled “Description of fields within Sub-Register 0x2A:03 – TC_SARL”.

The inline documentation for the function in the driver states:

Note on Temperature: the temperature value needs to be converted to give the real temperature
the formula is: 1.13 * reading - 113.0

Whereas the DW1000 User Manual states:

The value can be converted to a temperature by employing the formula:
Temperature (°C) = ((SAR_LTEMP – OTP_READ(Vtemp @ 23°C)) x 1.14) + 23

These two methods of converting a raw reading to a temperature give significantly different results; in my case they disagree by over 10 degrees Celsius.
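For concreteness, here’s a minimal sketch of the two conversions side by side, in C. The raw reading and OTP calibration value below are made-up example numbers (in practice the raw value would come back from dwt_readtempvbat and the calibration word would be read from the device’s OTP memory), so this only illustrates how far the formulas can diverge:

```c
#include <stdint.h>
#include <stdio.h>

/* Driver's inline-documentation formula. */
static double temp_driver(uint8_t raw_temp)
{
    return 1.13 * raw_temp - 113.0;
}

/* User manual formula (Sub-Register 0x2A:03 - TC_SARL section).
 * otp_tmeas_23c is the per-device "Vtemp @ 23 C" word from OTP. */
static double temp_manual(uint8_t raw_temp, uint8_t otp_tmeas_23c)
{
    return ((double)raw_temp - (double)otp_tmeas_23c) * 1.14 + 23.0;
}

int main(void)
{
    uint8_t raw = 0x7A;   /* example raw SAR_LTEMP reading (122)     */
    uint8_t otp = 0x71;   /* example per-device OTP value (113)      */

    printf("driver formula: %.2f C\n", temp_driver(raw));
    printf("manual formula: %.2f C\n", temp_manual(raw, otp));
    return 0;
}
```

With those example numbers the driver formula gives about 24.9 °C and the manual formula about 33.3 °C, the same order of disagreement described above.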

So which of these two is correct?

What do I need to do to get an answer? I’ve posted here and also sent my question via the form here.

There are two differences there - a scale factor and an offset.

The user manual method uses a device-specific, factory-set offset, while the driver uses a hard-coded fixed value. In that situation I’d say the device-specific factory-set value is going to be better.

If (SAR_LTEMP – OTP_READ(Vtemp @ 23°C)) gives you a value of 0 at 23 °C, then scaling it by 1.13 rather than 1.14 is only going to have an impact of around half a degree over the operating range (see the check below).
Given there’s no simple way of picking which scale factor is correct, and since the manual has been updated more recently than the drivers, I’d go with the number in the manual.
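For what it’s worth, here’s a back-of-envelope check of that half-degree claim, assuming (my assumption) the DW1000’s -40 to +85 °C operating range and the 23 °C calibration point:

```c
#include <stdio.h>

int main(void)
{
    /* Worst-case distance from the 23 C calibration point,
     * assuming an operating range of -40 to +85 C. */
    double delta_c = 23.0 - (-40.0);          /* 63 C            */
    double counts  = delta_c / 1.14;          /* ~55 raw counts  */
    double diff_c  = counts * (1.14 - 1.13);  /* ~0.55 C         */
    printf("max divergence from the scale factor: %.2f C\n", diff_c);
    return 0;
}
```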

Basically, go with the user manual formula: using device-specific calibration data is fundamentally better for the offset, and the scale factor difference isn’t large enough to worry about either way.
