Measurement Noise (Repeatability) vs. Data Rate - Unexpected

I have implemented a custom driver. I am characterizing performance using MDEK-1001 units, with the factory firmware behavior and the datasheet as my baseline.

On Channel 5 at 6.8 Mbps, with PRF = 64 MHz, PSR = 64, and PAC = 8, asymmetric double-sided two-way ranging behaves just like the factory firmware, in so far as ~500 data points at fixed line-of-sight distances of 5, 15, and 25 m have similar means and standard deviations (histograms show a Gaussian-like distribution, so I'm modeling the noise as Gaussian).
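For reference, a minimal sketch of the kind of repeatability check described above, using only the Python standard library. The samples here are *simulated* Gaussian noise around an assumed true distance (the real data would come from the driver); the constants are illustrative, matching the ~3 cm 1-sigma figure reported later in the post.

```python
import random
import statistics

# Hypothetical repeatability check: collect ~500 range samples at a fixed
# distance and summarize them. Here the samples are simulated as Gaussian
# noise around the true distance; real samples would come from the driver.
random.seed(0)
TRUE_DISTANCE_M = 15.0   # one of the fixed test distances
SIGMA_M = 0.03           # ~3 cm 1-sigma, as measured at 6.8 Mbps

samples = [random.gauss(TRUE_DISTANCE_M, SIGMA_M) for _ in range(500)]

mean_m = statistics.fmean(samples)
stdev_m = statistics.stdev(samples)
print(f"mean = {mean_m:.3f} m, 1-sigma = {stdev_m * 100:.1f} cm")
```

Comparing the mean and standard deviation per distance (5, 15, 25 m) against the factory firmware's figures is what establishes the baseline.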

Range is a priority in our application (and power is not; we are lucky in that regard at least…). I see the expected tradeoffs when switching channel, PRF, PSR, and PAC parameters, and as I adjust these parameters to extend range, the measurement noise stays at ~3 cm 1-sigma.

I am seeing an unexpected tradeoff when switching data rates. The 3 cm 1-sigma value holds at 6.8 Mbps, but down-shifting the data rate to 110 kbps pushes the noise out to ~7 cm 1-sigma. Does this suggest a problem with my driver, or am I missing a correlation between the payload data rate and the preamble symbols used for timing/ranging?


First off, thanks for the details. Interesting results.

Of course, the DW1000's claim to fame is achieving 10 cm accuracy, so you are well within that target.

But to explain what you are seeing: the data rate itself has no effect on the ranging performance. However, switching to 110 kb/s requires the use of a 64 µs SFD.

6.8 Mb/s uses an 8 µs SFD. The SFD does add noise to the time-of-arrival estimate, which means that if the preamble length is kept constant, using the longer 64 µs SFD for 110 kb/s will reduce the ranging performance.

Usually when switching to 110 kb/s, the length of the preamble is also increased. If the preamble is long (e.g. 1024 symbols), then the longer SFD sequence will not really be noticed.
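The argument above can be made concrete with a quick back-of-the-envelope calculation: what fraction of the synchronization header (preamble + SFD) does the SFD occupy? The symbol counts follow from the durations in the post (8 µs SFD at 6.8 Mb/s, 64 µs SFD at 110 kb/s, with roughly 1 µs per preamble symbol); the function name is just for illustration.

```python
# Sketch of why a long preamble masks the 64-symbol SFD used at 110 kb/s.
# Symbol counts inferred from the post: ~8-symbol (8 us) SFD at 6.8 Mb/s,
# ~64-symbol (64 us) SFD at 110 kb/s.

def sfd_fraction(preamble_symbols: int, sfd_symbols: int) -> float:
    """Fraction of the synchronization header occupied by the SFD."""
    return sfd_symbols / (preamble_symbols + sfd_symbols)

# Short preamble (PSR = 64, as in the original 6.8 Mb/s config),
# combined with the long SFD: the SFD is half the header.
print(f"PSR 64,   SFD 64: {sfd_fraction(64, 64):.0%}")

# Long preamble typically used at 110 kb/s: the SFD is a small tail.
print(f"PSR 1024, SFD 64: {sfd_fraction(1024, 64):.0%}")
```

With PSR = 64 the 64-symbol SFD makes up 50% of the header, while with PSR = 1024 it is only about 6%, which is why increasing the preamble length alongside the data-rate change recovers the ranging performance.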

I hope this helps.