I have implemented a custom driver and am characterizing its ranging performance on MDEK-1001 units, using the factory firmware's behavior and the datasheet as the baseline.
On Channel 5 at 6.8 Mbps, with PRF = 64 MHz, PSR = 64, and PAC = 8, my asymmetric double-sided two-way ranging behaves just like the factory firmware, insofar as ~500 data points at each of three fixed line-of-sight distances (5, 15, and 25 meters) show similar means and standard deviations (the histograms look Gaussian, so I'm modeling the error as Gaussian).
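For concreteness, here is the baseline configuration expressed in the stock Qorvo/Decawave dwt_config_t style rather than in my custom driver's internals; treat the preamble code and the SFD-timeout value as illustrative assumptions, not necessarily what I run:

```c
#include "deca_device_api.h"   /* stock Qorvo/Decawave DW1000 driver API */

/* Baseline radio configuration for the runs described above.
 * Field and constant names follow the standard dwt_config_t API;
 * the preamble code (9) and the SFD timeout are illustrative. */
static dwt_config_t cfg = {
    .chan           = 5,               /* channel 5 */
    .prf            = DWT_PRF_64M,     /* 64 MHz pulse repetition frequency */
    .txPreambLength = DWT_PLEN_64,     /* PSR = 64 symbols */
    .rxPAC          = DWT_PAC8,        /* preamble acquisition chunk = 8 */
    .txCode         = 9,               /* one of the codes valid for CH5 @ PRF 64 MHz */
    .rxCode         = 9,
    .nsSFD          = 0,               /* standard 8-symbol SFD at 6.8 Mbps */
    .dataRate       = DWT_BR_6M8,      /* 6.8 Mbps payload data rate */
    .phrMode        = DWT_PHRMODE_STD,
    .sfdTO          = 64 + 1 + 8 - 8   /* preamble + 1 + SFD - PAC (usual rule of thumb) */
};

void configure_radio(void)
{
    /* dwt_initialise(), SPI setup, etc. omitted */
    dwt_configure(&cfg);               /* apply the settings above */
}
```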
Range is a priority in our application (and power is not; we are lucky in that regard at least). I see the expected tradeoffs when switching channel, PRF, PSR, and PAC. As I adjust these parameters to extend range, the measurement noise holds at ~3 cm 1-sigma.
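For reference, the means and 1-sigma figures quoted here and above are plain two-pass sample statistics over each fixed-distance log; a minimal sketch of that reduction (the function name and signature are just for illustration):

```c
#include <math.h>
#include <stddef.h>

/* Sample mean and 1-sigma of one fixed-distance log (~500 ranges, in meters).
 * Two-pass computation; nothing DW1000-specific here. */
static void range_stats(const double *range_m, size_t n,
                        double *mean_m, double *sigma_m)
{
    double sum = 0.0;
    for (size_t i = 0; i < n; i++)
        sum += range_m[i];
    *mean_m = sum / (double)n;

    double ss = 0.0;
    for (size_t i = 0; i < n; i++) {
        double d = range_m[i] - *mean_m;
        ss += d * d;
    }
    *sigma_m = sqrt(ss / (double)(n - 1));   /* sample standard deviation */
}
```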
I am seeing an unexpected tradeoff when switching data rates. The ~3 cm 1-sigma value holds at 6.8 Mbps, but down-shifting the data rate to 110 kbps pushes the noise out to ~7 cm 1-sigma. Does this suggest a problem with my driver, or am I missing a correlation between the payload data rate and the preamble symbols used for timing/ranging?