Unsolved QSerialPort - Auto Baud Rate Detection
webzoid last edited by
Could anyone please suggest any strategies for automatic baud rate detection with QSerialPort? All sensors I communicate with via the serial port use either an ASCII text or an ASCII hex protocol.
Would it be valid to read a buffer of, say, 10 characters and check whether ALL characters fall within the standard ASCII range? Could there be issues with this approach? In my experience, incorrect baud rates usually lead to spurious characters being received which are well beyond the ASCII range.
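That sort of plausibility test is straightforward to sketch. The check below is a hypothetical helper (not part of QSerialPort): it accepts a buffer only if every byte is printable ASCII or common whitespace, on the assumption that wrong-baud-rate garbage usually contains bytes above 0x7F or stray control codes.

```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical sketch: treat a buffer as "plausible ASCII" only if every
// byte is printable ASCII (0x20..0x7E) or common whitespace (CR, LF, TAB).
// Garbage received at the wrong baud rate typically contains bytes >= 0x80
// or unexpected control characters, which this rejects.
bool looksLikeAscii(const uint8_t *buf, size_t len)
{
    for (size_t i = 0; i < len; ++i) {
        const uint8_t c = buf[i];
        const bool printable = (c >= 0x20 && c <= 0x7E);
        const bool whitespace = (c == '\r' || c == '\n' || c == '\t');
        if (!printable && !whitespace)
            return false;
    }
    return true;
}
```

One caveat, as noted below in the thread: a wrong baud rate can occasionally still yield bytes that happen to fall in the ASCII range, so a short 10-byte window may pass by chance; a longer window or a protocol-level check is more robust.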
Is there anything that could be done with the timing of received characters using QSerialPort which would be more accurate?
It is not a simple issue (IMHO), because in addition to different baud rates, the remote sensor can also communicate with different parity settings. So you need to iterate over both properties, baud rates and parities, and parse each received packet for validity.
Maybe your sensor could send some (SYNC) patterns which are sensitive to framing and parity errors, i.e. patterns which are guaranteed to be corrupted when the baud rate or parity is wrong.
webzoid last edited by
@kuzulis Thanks for your reply. A lot of the sensors used with the application will be GPS (or NMEA-outputting) sensors, so I can already assume what information will be output (ASCII text).
The other sensors I interface with just produce ASCII hex strings, similar to NMEA but with more hex-wrapped data. Either way, it wouldn't be possible to modify any hardware to include additional sync patterns, but ALL sensors already output a '$' sync character.
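Since every sensor emits '$'-prefixed, NMEA-style sentences, the sentence checksum is a much stronger validity test than the ASCII range alone: a frame received at the wrong baud rate or parity will essentially never produce a matching checksum. A minimal sketch, assuming the standard NMEA "*hh" convention (XOR of all characters between '$' and '*', compared against two hex digits after the '*'):

```cpp
#include <cstdio>
#include <string>

// Sketch: validate an NMEA-style sentence of the form "$...*hh".
// The checksum is the XOR of all characters strictly between '$' and '*',
// compared against the two hex digits following '*'. A sentence mangled
// by a wrong baud rate or parity setting will fail this check.
bool nmeaChecksumOk(const std::string &sentence)
{
    if (sentence.size() < 4 || sentence[0] != '$')
        return false;
    const std::size_t star = sentence.find('*');
    if (star == std::string::npos || star + 3 > sentence.size())
        return false;                   // need two hex digits after '*'
    unsigned char sum = 0;
    for (std::size_t i = 1; i < star; ++i)
        sum ^= static_cast<unsigned char>(sentence[i]);
    unsigned int expected = 0;
    if (std::sscanf(sentence.c_str() + star + 1, "%2x", &expected) != 1)
        return false;
    return sum == expected;
}
```

A workable detection rule could then be: accept a candidate baud/parity setting only after one or more complete sentences pass this check.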
Parity is a good observation. I guess there would be a possibility to include baud-rate tests with parity on/off.
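Combining the two dimensions just means probing the cross product of candidate baud rates and parity modes. A hypothetical sketch (the `Parity` enum here stands in for `QSerialPort::NoParity` / `EvenParity` / `OddParity` in real Qt code):

```cpp
#include <utility>
#include <vector>

// Hypothetical stand-in for QSerialPort's parity enum.
enum class Parity { None, Even, Odd };

// Build the full list of (baud, parity) combinations to probe, mirroring
// the idea of re-running the baud-rate test with each parity setting.
std::vector<std::pair<int, Parity>>
buildProbeList(const std::vector<int> &bauds)
{
    std::vector<std::pair<int, Parity>> combos;
    for (int b : bauds)
        for (Parity p : {Parity::None, Parity::Even, Parity::Odd})
            combos.emplace_back(b, p);
    return combos;
}
```

The list grows quickly (N rates x 3 parities), which is another argument for remembering the last known-good combination, as suggested later in this thread.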
Would the QSerialPort::error() function not give anything meaningful upon incorrect baud-rate/parity/stop/start bits?
Previously we tried to implement Frame/Parity error reporting in QSP. The problem is that it complicates the code too much, and besides, it does not work on all devices (some chips, drivers, platforms and so on simply do not implement the functionality to detect these errors internally), or it works differently. So these errors are now deprecated and not implemented at all.
My approach would be to send a command and simply look for an expected response; if you don't get the expected response, the baud rate is wrong (though there may be other reasons too). The sequence to find the baud rate would be to start at the lowest and work up to the highest until you get the expected response to a specific command. To be efficient (because this process is slow), store any found baud rate so that the next time the program runs it defaults to what was used last time (where 'Auto' would be the default if no previous setting exists).
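The scan sequence described above can be sketched without any hardware in the loop. Here `probe` is a hypothetical stand-in for the real test (in a Qt program it would call `QSerialPort::setBaudRate()`, send the command, and report whether the expected response arrived), and `lastKnown` carries the stored rate from the previous run:

```cpp
#include <functional>
#include <vector>

// Sketch of the lowest-to-highest scan. `probe` stands in for the real
// hardware test; `lastKnown` (0 if none) is tried first so a previously
// stored rate short-circuits the slow full scan. Returns the first rate
// for which `probe` succeeds, or 0 if no rate responds.
int findBaudRate(const std::vector<int> &candidates,
                 int lastKnown,
                 const std::function<bool(int)> &probe)
{
    if (lastKnown != 0 && probe(lastKnown))
        return lastKnown;
    for (int rate : candidates) {       // assumed sorted lowest to highest
        if (rate == lastKnown)
            continue;                   // already tried above
        if (probe(rate))
            return rate;
    }
    return 0;
}
```

A typical candidate list might be {4800, 9600, 19200, 38400, 57600, 115200}; each probe needs a timeout roughly as long as the sensor's slowest response, which is why caching the last result pays off.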
The hardware and low-level drivers of the OS should handle things like framing or parity errors (?). I don't know whether that information is available to inspect when you get this kind of error.
I don't think it is necessary to send special sequences or bit patterns. It depends on the situation I suppose.