I am struggling with the correct approach for delayed transmission. This is how I am doing it at the moment, but it doesn't work (the delay is far too long: multiple seconds instead of microseconds):
The question I keep asking myself is how to properly add a delay of 3000 microseconds to the system time, because the system time is not in microseconds but in system ticks. How can I convert 3000 µs to system ticks?
Best regards, and thank you so much for any tips & hints.
The system tick runs at a fixed, known rate, so all you need to do is multiply the time by the correct constant to convert the delay into ticks.
I use two constants in my code:

#define TIMEUNITS_TO_US (1 / (128 * 499.2))
#define US_TO_TIMEUNITS (128 * 499.2)
For distance calculations I then also define:

#define c_mPerS 299792458
#define c_mmPerTick (c_mPerS * TIMEUNITS_TO_US / 1000)
#define c_mPerTick (c_mmPerTick / 1000)
So multiply the time in µs by US_TO_TIMEUNITS and you will get the time in DW1000 ticks.
Keep in mind that the low 9 bits of the transmit time register are forced to 0, so your transmit will be early by (calculated_time MOD 512) * TIMEUNITS_TO_US microseconds.
If you are setting a transmit time based on the current clock, this probably isn't much of an issue; the exact transmit time clearly isn't that critical. However, it is more common to calculate a transmit time based on the last receive time rather than the current time. In that situation you normally want very accurate timing, so those low 9 bits matter; you will usually want to include them in the message being sent so the other end can correct for the error.