Giovanni wrote: I will update 20.3 with the latest changes and make those "official". Anyway, you should look into your delta setting; increasing delays is not the same thing. You may try a delta of 10, for example. With 32-bit timers you could get 2^32 delays, like you suggested, if you miss a deadline.
Thanks for looking at the backport! I'll start testing that as soon as it is available.
I had a couple of concerns about raising the delta. One is whether it would impact using timer capture to accurately parse soft-serial protocols at baud rates with a bit time smaller than the delta. I just tested that, though, with a delta of 50, and I seem to be able to still parse a 100 kbaud SBUS signal, so I must have misunderstood the impact of delta on timer capture.
The second concern is that raising the delta just seems to be papering over the issue: it lowers the probability of a wrap event without really fixing it. The fact that an empirical VT storm app is needed to measure the right delta does seem to imply that it isn't a real fix.
The new VTs have an important difference: previously "delta" was also influenced by the duration of callbacks, so you needed to trim it for your application. Now it is purely a function of CPU execution speed, making it more reliable; in general, much lower deltas are possible, reducing jitter. I tested on a 170 MHz G4 with a systick at 16 MHz and a delta of 15, very reliable.
I'm glad it no longer depends on callback duration, but this still leaves me nervous. A smaller delta should increase total CPU load, but if it causes an actual slip then that would seem to be a bug. The only reason I can think of for a slip of a full timer period would be an interrupt latency above the timer wrap time, which would be a very long ISR latency.
Cheers, Tridge