Still very new here; so far I've only used the AVR port of ChibiOS on a controller embedded in one of my personal robotics projects. Yes, I'll be upgrading to something more powerful in the future, but for now it's in there and it's powerful enough for what I currently need it to do.
Anyways,
The AVR board (an Arduino Uno R3) communicates with the host computer via its UART, through the board's integrated USB<->UART adapter. The protocol I wrote for it uses COBS/R as the framing format. The only serial/UART driver implemented for the AVR port is the Serial driver, so the receive side is essentially just a thread that calls the following in a loop:
Code: Select all
/* Blocks until a byte is available; returns the received byte,
   or MSG_RESET if the channel was reset. */
msg_t byte = chnGetTimeout(channel, TIME_INFINITE);
Obviously the channel driver is way overkill for this. Ideally I'd like to manage my own ring buffer so that the COBS decoding can happen in the ISR, waking the packet-handling thread only when a frame delimiter is received instead of on every byte.
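Something along these lines is what I have in mind, as a minimal sketch. rx_buf, rx_len, rx_byte_from_isr, and packet_thread are placeholder names I made up; the wakeup uses the standard chThdResumeI/chThdSuspendS pairing:

Code: Select all
/* Sketch only: rx_buf, rx_len and packet_thread are placeholder names,
 * not existing ChibiOS objects. */
#define RX_BUF_SIZE 128

static uint8_t rx_buf[RX_BUF_SIZE];
static size_t rx_len;
static thread_reference_t packet_thread;

/* Called from the USART receive-complete ISR with each incoming byte. */
static void rx_byte_from_isr(uint8_t byte) {
  if (byte == 0x00) {
    /* COBS frame delimiter: wake the decoder thread once per packet,
     * passing the frame length as the wakeup message. */
    chSysLockFromISR();
    chThdResumeI(&packet_thread, (msg_t)rx_len);
    chSysUnlockFromISR();
    rx_len = 0;
  }
  else if (rx_len < RX_BUF_SIZE) {
    rx_buf[rx_len++] = byte;
  }
  /* else: overflow; a real implementation would flag the error, and
   * would double-buffer so the ISR can't overwrite a frame while the
   * thread is still decoding it. */
}

The packet thread would then just park itself until a full frame is in:

Code: Select all
/* Waiting side: sleeps until the ISR resumes it with a frame length. */
chSysLock();
msg_t frame_len = chThdSuspendS(&packet_thread);
chSysUnlock();
/* decode rx_buf[0..frame_len) as a COBS/R frame here */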
I'm still trying to understand the difference between the use cases for the SIO driver and the UART driver. Both seem intended to interact directly with the underlying peripheral hardware, using the hardware FIFO and potentially DMA on platforms that have them. The ATmega of course has neither: there's technically a single-byte receive buffer, but no transmit buffer.
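For comparison, my understanding is that if the AVR port grew a classic UARTDriver, the per-byte path would go through the standard rxchar_cb hook in UARTConfig, something like the following (hypothetical, since no AVR UARTDriver exists today; the platform-specific config fields are omitted):

Code: Select all
/* Hypothetical UARTConfig for an AVR UARTDriver that doesn't exist yet.
 * Only the portable callback fields are shown. */
static void rxchar(UARTDriver *uartp, uint16_t c) {
  (void)uartp;
  rx_byte_from_isr((uint8_t)c);  /* reuse the ISR-side handler above */
}

static const UARTConfig uart_cfg = {
  .txend1_cb = NULL,      /* TX buffer emptied */
  .txend2_cb = NULL,      /* physical end of transmission */
  .rxend_cb  = NULL,      /* RX buffer filled */
  .rxchar_cb = rxchar,    /* fires from the ISR for each byte received
                             while no receive operation is in progress */
  .rxerr_cb  = NULL,      /* receive error */
  /* ...platform-specific speed/mode fields would follow... */
};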
So my question is: if I'm going to try my hand at implementing a new driver for the internal UART in 2025, should I be implementing a UART driver or an SIO driver? Also, with respect to the fact that the AVR has neither FIFOs nor DMA...