Ranging accuracy vs received signal level

I have a question concerning correction of timestamps (ranging accuracy vs received signal level of APS011).

In practice, the reported ranging distance can vary by ~20-30 cm depending on the received signal level.

I was wondering whether the correction could be applied at run-time based on the measured received power (e.g., looking at the first-path power), rather than using the reported range itself as the table index, as in the DecaRanging-style implementation.

I’m not sure I understand what you are saying.

Are you asking whether you can use the measured signal level as the input to your correction table? That is, use Table 2 from APS011 to correct the range rather than Table 1?
Yes, you can.
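As a minimal sketch of that idea, the correction can be looked up (and interpolated) from a table keyed on received power rather than on range. Note the power levels and correction values below are illustrative placeholders, not the actual APS011 Table 2 entries, and the sign convention (correction subtracted from the raw range) is an assumption you should check against your own calibration.

```python
import bisect

# (rx_power_dBm, correction_m) pairs, sorted by power.
# Placeholder values for illustration only -- NOT the APS011 table.
POWER_CORRECTION_TABLE = [
    (-95.0, -0.20),
    (-85.0, -0.10),
    (-75.0,  0.00),
    (-65.0,  0.10),
    (-55.0,  0.20),
]

def correct_range(measured_range_m, rx_power_dBm):
    """Apply a power-indexed bias correction, interpolating linearly
    between table entries and clamping outside the table."""
    powers = [p for p, _ in POWER_CORRECTION_TABLE]
    if rx_power_dBm <= powers[0]:
        corr = POWER_CORRECTION_TABLE[0][1]
    elif rx_power_dBm >= powers[-1]:
        corr = POWER_CORRECTION_TABLE[-1][1]
    else:
        i = bisect.bisect_right(powers, rx_power_dBm)
        p0, c0 = POWER_CORRECTION_TABLE[i - 1]
        p1, c1 = POWER_CORRECTION_TABLE[i]
        t = (rx_power_dBm - p0) / (p1 - p0)
        corr = c0 + t * (c1 - c0)
    return measured_range_m - corr

# At the reference power level the correction is zero:
print(correct_range(5.00, -75.0))  # -> 5.0
```

The interpolation step matters if your table is coarse; with a fine-grained table a nearest-entry lookup is usually good enough.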

However, if you have a line-of-sight environment then (in theory) there will be a defined relationship between range and signal power, so either of them will be equally good as a table index. Since you’re already calculating the range, why not use that rather than computing an extra number?
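For comparison, the range-indexed variant (the Table 1 style the reply describes) is even simpler, since the index is a number you already have. Again, the table values here are illustrative placeholders, not APS011 data.

```python
# Sketch: range-bias correction indexed by the measured range itself
# (Table 1 style). Placeholder values -- NOT the actual APS011 table.
RANGE_BIAS_TABLE = [
    (1.0,   0.20),   # (range_m, bias_m)
    (5.0,   0.10),
    (10.0,  0.00),
    (20.0, -0.10),
]

def bias_from_range(range_m):
    """Pick the bias of the nearest table entry (no interpolation)."""
    return min(RANGE_BIAS_TABLE, key=lambda e: abs(e[0] - range_m))[1]

def correct_range(raw_range_m):
    # Subtract the bias associated with the raw (uncorrected) range.
    return raw_range_m - bias_from_range(raw_range_m)
```

One subtlety: the raw range, not the corrected one, selects the table entry, which is fine as long as the bias is small compared to the table’s range steps.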

If you don’t have line of sight then signal power is certainly a better option, but in that case you probably have larger errors to worry about anyway.

The other option is to create an equation that approximates the range-bias curve and calculate the correction at run-time rather than keep a lookup table.
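A sketch of that curve-fit approach, assuming illustrative calibration points (not APS011 data): fit the bias curve once offline, then evaluate the fitted function at run-time, trading table memory for a few multiplications. A straight line is used here because the sample points happen to be linear; a higher-order polynomial fit follows the same pattern.

```python
# Illustrative (rx_power_dBm, bias_m) calibration points -- NOT APS011 data.
samples = [(-95.0, -0.20), (-85.0, -0.10), (-75.0, 0.00),
           (-65.0, 0.10), (-55.0, 0.20)]

def fit_line(points):
    """Least-squares fit of bias = a*power + b to the calibration points."""
    n = len(points)
    sx = sum(p for p, _ in points)
    sy = sum(b for _, b in points)
    sxx = sum(p * p for p, _ in points)
    sxy = sum(p * b for p, b in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

a, b = fit_line(samples)

def bias(rx_power_dBm):
    # Run-time evaluation: one multiply and one add, no table storage.
    return a * rx_power_dBm + b

correction = bias(-75.0)  # close to zero for these sample points
```

Whether this wins over a table depends on the shape of the real bias curve; if it needs a high-order polynomial to fit well, a small table with interpolation may end up cheaper.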

It all comes down to trade-offs between speed, size, complexity, and accuracy.