GPT interrupt handling in AVR taking too long

ChibiOS public support forum for topics related to the Atmel AVR family of micro-controllers.

Moderators: utzig, tfAteba

orion
Posts: 17
Joined: Sun Nov 12, 2017 7:56 pm
Has thanked: 4 times
Been thanked: 3 times

GPT interrupt handling in AVR taking too long

Post by orion » Tue Nov 28, 2017 10:15 pm

Hi folks,

I've been reading the code of the GPT module in the AVR port, one of the modules that uses the timers of an ATmega chip. The module takes a frequency and a period parameter. As I understand it, the frequency sets the timer clock in Hz and the period is the number of timer clock ticks in the interval generated by the GPT. The problem is that the clock prescaler of the AVR timers can only take a handful of values (7 for timer2 and 5 for the others). The current GPT implementation makes the frequency more flexible by combining the few prescaler values with the compare-and-reset register (OCRA). For example, to get 10 kHz with F_CPU = 16 MHz, you can choose a prescaler of 64 and set OCRA to 25 - 1 = 24; the match-reset event then fires at 10 kHz, and that becomes the GPT timer clock.

This forces the implementation to maintain the period counter in an ISR, which is not a good thing IMO, since you pay both for saving all the registers used and for the processing itself. Looking at the assembly listing, I suspected this could take more than 5 μs, so I set a pin at the beginning of the ISR and cleared it at the end (done in hal_gpt_lld.c) to measure the time spent there. I took the testhal application for the GPT module in the AVR port, configured a frequency of 10 kHz and a period of 500 clock ticks, with the timer callback toggling another digital pin, burned the program into an Arduino Nano, and measured the ISR pin with a scope. The ISR handling takes almost 10 μs, which means the maximum usable frequency would be 100 kHz. If the frequency is set to 50 kHz instead, the measurement shows the ISR handling eating about 50% of the CPU time.
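To make the setup concrete, here is a minimal sketch of the kind of test described above, assuming the standard ChibiOS GPT API with GPTD1 enabled; the pin choice (PB1) and the two-field GPTConfig initializer are illustrative assumptions, not the actual testhal code.

Code:

#include <avr/io.h>
#include "ch.h"
#include "hal.h"

/* Callback fired once per GPT interval: toggle the output pin. */
static void gpt_cb(GPTDriver *gptp) {
  (void)gptp;
  PORTB ^= (1 << PB1);
}

static const GPTConfig gptcfg = {
  10000,      /* Timer clock frequency [Hz].   */
  gpt_cb      /* Callback on interval expiry.  */
};

int main(void) {
  halInit();
  chSysInit();

  DDRB |= (1 << PB1);               /* Measurement pin as output.   */

  gptStart(&GPTD1, &gptcfg);        /* 10 kHz timer clock.          */
  gptStartContinuous(&GPTD1, 500);  /* 500 ticks -> 50 ms interval. */

  while (true) {
    chThdSleepMilliseconds(100);
  }
}

With these numbers the callback only toggles the pin every 50 ms, but the compare-match ISR that maintains the period counter still runs at the full 10 kHz, and that per-tick handling is what the attached scope traces capture.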

We can do better, but either we forgo the flexibility on the frequency value and settle for the few allowed prescaler values, or we delay the timer clock initialization to the moment when both the frequency and the period are known, i.e. when gptStartTimer() is called.
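As a rough illustration of the second alternative (not actual driver code), this is how Timer1 on an ATmega328P could be programmed in a single step once both values are known, so that the compare match itself marks the end of the GPT interval and no software counter has to be updated in the ISR. The helper name, the prescaler table and F_CPU being provided by the build system are all assumptions.

Code:

#include <avr/io.h>
#include <stdint.h>

/* Timer1 clock-select values 1..5 correspond to these prescalers. */
static const uint16_t prescalers[] = {1, 8, 64, 256, 1024};

/* Program one GPT interval; returns 0 on success, -1 if it does not fit. */
static int gpt_program_interval(uint32_t freq_hz, uint16_t period_ticks) {
  /* CPU cycles per GPT interval. */
  uint32_t cycles = (uint32_t)((uint64_t)F_CPU * period_ticks / freq_hz);

  for (uint8_t i = 0; i < sizeof(prescalers) / sizeof(prescalers[0]); i++) {
    uint32_t top = cycles / prescalers[i];
    if ((top >= 1UL) && (top <= 65536UL)) {
      OCR1A  = (uint16_t)(top - 1);
      TCCR1A = 0;
      TCCR1B = (1 << WGM12) | (uint8_t)(i + 1); /* CTC mode + clock select. */
      TIMSK1 = (1 << OCIE1A);                   /* Compare-match interrupt. */
      return 0;
    }
  }
  return -1; /* Frequency/period combination not representable. */
}

For the example above (10 kHz and 500 ticks at F_CPU = 16 MHz) this gives 800000 cycles, which fits with the /64 prescaler (OCR1A = 12499), so the callback could be fired directly from the compare-match ISR at 20 Hz instead of entering the ISR 10000 times per second.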

Would there be interest in this kind of optimization? If so, which alternative seems more suitable?

I can try to implement it, if there is interest.

Thanks.
Attachments
GPT_isr_freq_50kHz.png
ISR handling time for f=50kHz
GPT_isr_freq_10kHz.png
ISR handling time for f=10kHz

tfAteba
Posts: 547
Joined: Fri Oct 16, 2015 11:03 pm
Location: Strasbourg, France
Has thanked: 91 times
Been thanked: 48 times

Re: GPT interrupt handling in AVR taking too long

Post by tfAteba » Thu Dec 21, 2017 3:09 pm

Hi Orion,

You can propose your improvement to the code; any contribution is welcome.

It could be interesting for other people too :D

Thanks.
Regards,

Theo.

orion
Posts: 17
Joined: Sun Nov 12, 2017 7:56 pm
Has thanked: 4 times
Been thanked: 3 times

Re: GPT interrupt handling in AVR taking too long

Post by orion » Fri Dec 22, 2017 7:04 pm

OK, I'll finish working on it and upload the code.

