How is the leading edge threshold calculated?

Sub-Register 0x2E:0000 – LDE_THRESH contains the threshold used for leading edge detection, which is based on the noise estimate and the LDE_CFG1 and LDE_CFG2 values.

How is this value arrived at?

LDE_CFG1 : PMULT is described as “factor by which the peak value of estimated noise is increased in order to set the threshold for first path searching”, with a value of 3 = x1.5, but there is no mention of how this is calculated, so there is no indication of how changing it will impact results.

LDE_CFG1 : NTM is described as “factor by which the observed noise level is multiplied to set the threshold for the LDE algorithm’s first path search”
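For reference, my reading of the LDE_CFG1 bit layout from the register map (so treat the field positions as my interpretation) is NTM in bits 4:0 and PMULT in bits 7:5 – the default value 0x6C then decodes to NTM = 12, PMULT = 3, which matches the descriptions above:

```c
#include <stdint.h>

/* LDE_CFG1 (sub-register 0x2E:0806) field layout, as I read it from the
 * register map: NTM in bits 4:0, PMULT in bits 7:5. The default value
 * 0x6C decodes to PMULT = 3 (x1.5) and NTM = 12. */
#define LDE_CFG1_NTM_MASK    0x1Fu /* bits 4:0 */
#define LDE_CFG1_PMULT_SHIFT 5u    /* bits 7:5 */

static uint8_t lde_cfg1_encode(uint8_t ntm, uint8_t pmult)
{
    return (uint8_t)(((pmult & 0x07u) << LDE_CFG1_PMULT_SHIFT) |
                     (ntm & LDE_CFG1_NTM_MASK));
}
```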

Sub-Register 0x12:0000 – RX_FQUAL includes STD_NOISE, “the standard deviation of the noise level seen during the LDE algorithm’s analysis of the accumulator data”

Based on those descriptions, I would expect the threshold to be something along the lines of STD_NOISE * PMULT * NTM, or STD_NOISE * NTM + PMULT * PeakNoise (where peak noise is defined as… what, exactly?).

And LDE_THRESH is indeed somewhere in that region, but there doesn’t seem to be any exact relationship between them. There are clearly some other values being factored into the calculation, but I can’t see any mention of what they are in the documentation.
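To make the comparison concrete, this is roughly how I’ve been checking those guesses against what the chip reports – a sketch against the Decawave device driver API, with the register offsets hard-coded from the user manual (deca_regs.h may use different names), and taking maxNoise from the diagnostics as the “peak noise” is purely my assumption:

```c
#include <stdint.h>
#include <stdio.h>
#include "deca_device_api.h" /* dwt_readdiagnostics(), dwt_read16bitoffsetreg() */

/* Offsets hard-coded from the user manual; deca_regs.h may use other names. */
#define LDE_IF_ID         0x2E
#define LDE_THRESH_OFFSET 0x0000

/* Call after a received frame. ntm and pmult are the raw field values
 * currently programmed into LDE_CFG1. */
static void dump_threshold_guesses(uint8_t ntm, uint8_t pmult)
{
    dwt_rxdiag_t diag;
    dwt_readdiagnostics(&diag); /* fills stdNoise, maxNoise, ... */

    uint16_t thresh = dwt_read16bitoffsetreg(LDE_IF_ID, LDE_THRESH_OFFSET);

    /* The manual says PMULT = 3 means x1.5, so I am guessing the actual
     * multiplier is PMULT / 2. */
    double pfac = pmult / 2.0;

    /* Guess 1: everything multiplied together. */
    double guess1 = (double)diag.stdNoise * ntm * pfac;
    /* Guess 2: NTM scales the noise std dev, PMULT scales the peak noise.
     * Using maxNoise as the "peak" is purely my assumption. */
    double guess2 = (double)diag.stdNoise * ntm + pfac * (double)diag.maxNoise;

    printf("LDE_THRESH=%u  guess1=%.1f  guess2=%.1f\n",
           (unsigned)thresh, guess1, guess2);
}
```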

And then there is Sub-Register 0x2E:1806 – LDE_CFG2, which should be 0x1607 (16 MHz PRF) or 0x0607 (64 MHz PRF) for LOS, or 0x0003 for NLOS, but beyond that it is completely undocumented.
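Those three documented values are really all there is to work with, so the most you can do is select between them; a minimal helper might look like this (offset from the register map; the macro names are mine, not necessarily those in deca_regs.h):

```c
#include <stdint.h>
#include "deca_device_api.h" /* dwt_write16bitoffsetreg() */

#define LDE_IF_ID       0x2E
#define LDE_CFG2_OFFSET 0x1806

/* The only three documented LDE_CFG2 values. */
#define LDE_CFG2_LOS_PRF16 0x1607u /* LOS, 16 MHz PRF */
#define LDE_CFG2_LOS_PRF64 0x0607u /* LOS, 64 MHz PRF */
#define LDE_CFG2_NLOS      0x0003u /* NLOS */

static void lde_cfg2_select(int prf64, int nlos)
{
    uint16_t v = nlos ? LDE_CFG2_NLOS
                      : (prf64 ? LDE_CFG2_LOS_PRF64 : LDE_CFG2_LOS_PRF16);
    dwt_write16bitoffsetreg(LDE_IF_ID, LDE_CFG2_OFFSET, v);
}
```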


Hi Andy,

I was also trying to investigate the LDE threshold. According to “Application note: DW1000 Metrics for Estimation of Non Line Of Sight Operating Conditions”, the threshold is calculated as “Standard Noise Deviation x NTM”. Therefore, I changed the value of NTM and read the standard deviation of noise (STD_NOISE) and the LDE threshold (LDE_THRESH) from the diagnostic data.

The results show that there is some relationship between NTM, STD_NOISE, and LDE_THRESH. Specifically, when I set NTM between 23 and 32, the ratio between LDE_THRESH and STD_NOISE tracks NTM. For example, when I set the NTM value to 26 (11010 in binary), LDE_THRESH / STD_NOISE is around 27; as I increase the NTM value to 31 (11111 in binary), LDE_THRESH / STD_NOISE is around 32.

However, when NTM is lower than 23, the LDE_THRESH / STD_NOISE ratio remains unchanged. For example, even when I set NTM to 0, the ratio was still around 23.

This really confuses me, since according to the application note NTM is set to 12 by default, but when I set the value to 12 (01100 in binary), LDE_THRESH / STD_NOISE is still around 23. I also tried changing the PMULT value, but I did not see any influence on the results.
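In case anyone wants to reproduce this, my sweep looked roughly like the following (receive_one_frame() is a placeholder for whatever RX routine you use, and the register offsets are hard-coded from the user manual rather than taken from deca_regs.h):

```c
#include <stdint.h>
#include <stdio.h>
#include "deca_device_api.h"

#define LDE_IF_ID         0x2E
#define LDE_THRESH_OFFSET 0x0000
#define LDE_CFG1_OFFSET   0x0806

extern void receive_one_frame(void); /* placeholder for your own RX path */

/* Sweep NTM (bits 4:0 of LDE_CFG1) with PMULT held at its default of 3
 * (bits 7:5), and log the LDE_THRESH / STD_NOISE ratio for each frame. */
static void sweep_ntm(void)
{
    for (uint8_t ntm = 0; ntm <= 31; ntm++) {
        dwt_write8bitoffsetreg(LDE_IF_ID, LDE_CFG1_OFFSET,
                               (uint8_t)((3u << 5) | ntm));
        receive_one_frame();

        dwt_rxdiag_t diag;
        dwt_readdiagnostics(&diag);
        uint16_t thresh = dwt_read16bitoffsetreg(LDE_IF_ID, LDE_THRESH_OFFSET);

        printf("NTM=%2u  STD_NOISE=%u  LDE_THRESH=%u  ratio=%.2f\n",
               (unsigned)ntm, (unsigned)diag.stdNoise, (unsigned)thresh,
               diag.stdNoise ? (double)thresh / diag.stdNoise : 0.0);
    }
}
```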

Does anyone have any idea on this issue?