Problem in QSerialPort
-
What about using read(qint64 maxSize)? With that I can read one byte at a time, which would make it possible to get the timing information between two bytes.
QSerialPort handles the serial port in such a way that incoming data is read and stored in an internal buffer (via QIODevice) as soon as it is available. QSerialPort then emits the readyRead signal to tell you that there is data ready to read. However, that signalling typically does not work on a byte-by-byte basis. Check how many bytes are available when readyRead is triggered; in my experience there are typically already several bytes waiting. For really slow communication you may receive a readyRead after each byte, but I consider 30 ms rather short and do not think you stand a chance.
Certainly you can read byte by byte with the read function, but you are reading from the buffer. You would therefore only be measuring how fast your application can read byte by byte. However, I think you meant to check how often a byte arrives on the UART (serial port) and want to exclude bytes that arrive with too much delay (more than 30 ms).
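To illustrate the point, here is a minimal sketch (the port name and baud rate are placeholders you must adjust; it needs QT += serialport in the .pro file): all the read() calls complete back to back from the internal buffer, so timing them tells you nothing about arrival times on the wire.

#include <QCoreApplication>
#include <QSerialPort>
#include <QDebug>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    QSerialPort port;
    port.setPortName("COM3");                 // placeholder: adjust for your system
    port.setBaudRate(QSerialPort::Baud9600);  // placeholder: your sensor's rate
    if (!port.open(QIODevice::ReadOnly))
        return 1;

    QObject::connect(&port, &QSerialPort::readyRead, [&port]() {
        // Often several bytes are already buffered by the time we get here.
        qDebug() << "bytes already buffered:" << port.bytesAvailable();
        char byte;
        while (port.read(&byte, 1) == 1)      // drains the internal buffer instantly
            qDebug() << "read byte:" << static_cast<int>(byte);
    });

    return app.exec();
}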
-
Yes, I need to discard the whole packet of 8 bytes sent by the sensor if the delay between the first and the second byte received is more than 30 ms.
P.S.: Under normal conditions the sensor sends the bytes with a 10 ms delay between them.
The only feasible approach, IMHO, is to measure the timing indirectly, e.g. by measuring the time elapsed since the last call to your slot function. When you have received 80 bytes within a tenth of a second, you know that there must be 8 bytes every 10 ms. This assumes that only this information has been sent.
The problem is that the event loop might "hang" irregularly depending on your other tasks. You should measure the elapsed time between calls and base your decision for the next steps on the outcome.
-
There are some calculations and real-time graph plotting involved, based on the data received from the device, so it is relatively difficult to measure the timing as you suggest here, but I'll still give it a try.
In the meantime, do you think we can limit the buffer size so that we can somehow bypass this internal buffering? It is somewhat risky in terms of data loss, though.
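For reference, QSerialPort does expose a buffer limit via setReadBufferSize(); what I have in mind is roughly the following sketch (assuming the port is already configured and opened):

// Sketch only: cap the internal buffer at one packet (8 bytes).
// Once the buffer is full, QSerialPort stops reading from the device,
// which is exactly where the risk of losing data comes in.
SerialPort->setReadBufferSize(8);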
-
You are receiving 800 bytes per second. I doubt that you can make the buffer size small enough.
However, measuring the time basically means one additional time reading each time the slot routine is entered. Certainly this costs something, but it should not be a major burden, I would say. You can also simply output the number of bytes you find in the buffer, e.g. with
qDebug() << SerialPort->bytesAvailable();
More convenient could be an elapsed-time measurement, as already suggested by @mostefa above.
-
Using timing to decide whether data is good or bad doesn't sound like a good idea. As mentioned, the RX buffers in QSerialPort are going to be a problem to start with. Serial is slow enough that you might be able to pull it off (this assumes the computer is running considerably faster and can deal with the serial data in, almost, real time), but this approach doesn't sound reliable at all.
If I understand this properly, data that is received within 10 ms of some event is good and data at 30 ms is to be rejected. I assume the 'event' is a command you send over serial to read the sensor data, and the 10/30 ms is the amount of time it takes for the sensor to respond. If you are sending a command over serial, why is a response at 30 ms no good where one at 10 ms is good?
If there is no command sent by you over serial, but instead you are monitoring data and trying to synchronize to a timing pattern, then maybe use a thread or timer which checks for RX data at regular intervals (say every 10 or 20 ms) and go from there. You might have to read the serial port directly to bypass the QSerialPort buffer problem (maybe). I believe the hardware/driver also has a buffer that could be a problem (?).
If it is a case where there is no response to a command you send, and the only way to tell is by the amount of quiet time, then wouldn't the next byte received (even 30 ms later) be garbage or something unrelated, and couldn't you detect this by looking at the data itself? For example, if the sensor is reading temperature and the last reading was 25 C but the next reading is very different or some default value, could this be used to throw out bad responses?
If this is a case where blocks of data are sent continuously but you are looking for one small part of the data, then maybe trying to identify the start and end of the blocks would be a better idea. For example, you may have something that transmits in blocks of 64 bytes separated by something like '\r' or '\0'. If you read the data looking for the start of the blocks, then your data of interest should always be at some offset from the start of the block (assuming the block size is always the same).
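As a rough example of that framing idea (the '\r' delimiter, the offset of 3 and the handler are all assumptions for illustration, not taken from any real protocol):

#include <QSerialPort>
#include <QDebug>

QByteArray rxBuffer;   // persists between readyRead calls

void onReadyRead(QSerialPort *port)
{
    rxBuffer.append(port->readAll());

    int end;
    while ((end = rxBuffer.indexOf('\r')) != -1) {       // assumed delimiter
        QByteArray frame = rxBuffer.left(end);           // one complete block
        rxBuffer.remove(0, end + 1);

        const int kOffset = 3;                           // hypothetical offset of the value
        if (frame.size() > kOffset)
            qDebug() << "value:" << static_cast<quint8>(frame.at(kOffset));
    }
}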
-
As an option, you could send timestamps from the device (together with the measured data). Then you can compare the timestamps of the received frames; e.g. your frame can look like: <timestamp><data><\r>
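A sketch of such a timestamp check (the frame layout here, a 4-byte little-endian millisecond timestamp in front of the data, is purely an assumption):

#include <QByteArray>
#include <QtEndian>

quint32 lastStamp = 0;

bool frameOnTime(const QByteArray &frame)
{
    if (frame.size() < 4)
        return false;                         // too short to hold a timestamp

    const quint32 stamp = qFromLittleEndian<quint32>(
        reinterpret_cast<const uchar *>(frame.constData()));
    const bool onTime = (lastStamp == 0) || (stamp - lastStamp <= 30);  // 30 ms rule
    lastStamp = stamp;
    return onTime;
}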
-
Thanks. According to the communication protocol provided with the sensor, I have to send a command, after which I receive continuous packets of data with varying delays between two successive bytes, and I need to discard packets that do not arrive according to the prescribed timing information.
-
I am not sure whether this code does the job (of discarding packets) or not, as I am getting the waveform with or without the discarded packets, but I tried this code and I am getting a milliseconds value in 'diff':
time1.start();
int diff = 0;
do {
    nbf_byte = readSerial[1];   // read byte number 1
    qDebug() << nbf_byte;
    diff = time1.elapsed();
} while (diff <= 30);
if (nbf_byte != 0) {
    // go ahead
} else {
    // discard
}
-
What is readSerial?
It looks like an array. This implies that the data is already stored somewhere, so you are measuring the time needed to access this byte. With your code you are basically measuring the time to fetch the same byte over and over, to output that byte, and to determine the elapsed time. Most likely the loop will just spin until the 30 ms have elapsed.
You would need something along the lines of:
class MySerialSupport : public QObject {
    Q_OBJECT

    QSerialPort *SerialPort;
    QElapsedTimer *Timer;

public:
    MySerialSupport() {
        SerialPort = new QSerialPort;
        // probably some more stuff required here (port settings, open, ...)
        connect(SerialPort, &QSerialPort::readyRead,
                this, &MySerialSupport::sltReadyRead);
        Timer = new QElapsedTimer;
        Timer->start();
    }

public slots:
    void sltReadyRead();
};

void MySerialSupport::sltReadyRead() {
    qDebug() << Timer->restart() << " " << SerialPort->bytesAvailable();
    // handle reading the serial port
}
Every time the readyRead signal is emitted you will enter the slot, and it will give you the time since the last call. If you are lucky, it will report a small number of bytes each time. Whenever the event loop was blocked for a while, you will see this as a larger elapsed-time value. If this is combined with a larger number of bytes, you may conclude that the event loop had been blocked (even though you cannot be completely sure about this).
Note: the code fragments above went straight from brain to keyboard and require testing.
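Building on that sketch, the 30 ms discard rule could then look roughly like this (packet would be a QByteArray member, processPacket() a hypothetical handler; only the 8-byte packet size comes from this thread):

void MySerialSupport::sltReadyRead()
{
    qint64 gap = Timer->restart();       // ms since the previous readyRead

    // If we were in the middle of a packet and the pause was too long,
    // drop the bytes collected so far (i.e. discard that packet).
    if (!packet.isEmpty() && gap > 30)
        packet.clear();

    packet.append(SerialPort->readAll());

    while (packet.size() >= 8) {         // the sensor sends 8-byte packets
        processPacket(packet.left(8));   // hypothetical handler
        packet.remove(0, 8);
    }
}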
-
Thanks, my previous do-while code was delaying the data by 30 ms.
Now I am getting the timing information since the last call, as per your suggestion, but I am not sure whether it is due to the sensor or the GUI: the time elapsed since the last call increases linearly as time progresses.
Here is the initial timing information: [screenshot]
And here is the timing information after 2 hours: [screenshot]
-
My guess is that your GUI is causing the delay. It is probably your real-time plotting taking longer with every step.
From the numbers you are presenting, I would certainly study the behaviour there in more detail. E.g. you can measure, in a similar manner, the time required within your plot routine. I assume that you are reading all available bytes at once when you are in the routine triggered by readyRead. If you are reading only one complete 8-byte packet before each plot-processing step, there is also the possibility of a build-up when the plotting takes almost as long as the interval between new data events. The readyRead signal does not have a reminder behaviour indicating that there is still data in the buffer; it only indicates that new data has been received.
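For instance, the plot routine could be instrumented in the same way (MainWindow and updatePlot() are hypothetical stand-ins for your real class and plotting call):

#include <QElapsedTimer>
#include <QDebug>

void MainWindow::updatePlotTimed()
{
    QElapsedTimer t;
    t.start();
    updatePlot();                                   // the actual plotting work
    qDebug() << "plot update took" << t.elapsed() << "ms";
}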
-
Great, your guess is completely right.
I have skipped the graph plotting and now the timing is perfect and constant.
I am using QCustomPlot for real-time data plotting as described here.
One thing to notice: even if I replace the real-time sensor data with a constant value to be plotted on the graph, the GUI starts delaying the timing information progressively. Hence the graph function described there delays the GUI with or without sensor data.
-
I have never used that software, so this is more crystal-ball reading. It is typical for such an application to show increasing CPU consumption over time.
The question is whether you have to update every 10 ms. Nobody can see changes at 100 Hz. Compared to the frame rate of film, you are safe updating the plot only on every fifth data update; possibly every tenth is fine too. This considers the update rate only.
When the data changes slowly and without large noise, you may want to plot only every tenth data point, or some other interval.
However, those are expert decisions you have to make.
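A decimation sketch along those lines (kPlotEvery, appendToPlotData() and updatePlot() are all assumptions):

const int kPlotEvery = 5;        // redraw at ~20 Hz for one packet per 10 ms
int plotCounter = 0;

void onPacket(const QByteArray &packet)
{
    appendToPlotData(packet);    // hypothetical: always store the data
    if (++plotCounter % kPlotEvery == 0)
        updatePlot();            // hypothetical: redraw only every Nth packet
}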
-
This is not feasible here, as the sensor provides critical information about patients under observation; however, averaging looks like a viable option.
Nevertheless, coming back to the timing information: thanks for all of your help. The indirect method is able to fetch the timing information for the whole packet (though not for single bytes), which is quite helpful for discarding the bad packets of data.
Some provision is needed in Qt, either to bypass the internal buffer or via some other, smarter way, so that this kind of issue can be tackled directly in the future.
-
@sush "critical information of patients under observation" - using a protocol which verifies data validity based on timing measurements sounds really strange! I hope my life will never depend on such technology. Really, if you write software for such critical stuff, you should think about a reliable protocol and not mess around with buffers and time measurements.
-
Heh, "Problem in QSerialPort" sounds strange in this context. The main problem is in your approach. You are trying to cross a hedgehog (a real-time sensor) with a snake (a non-real-time OS), which is impossible in principle. And your problem is not about "avoiding the buffering".
At the very least, you should use an external MCU, connected to the sensor, to handle the data stream, as that is "true" real time. Then you connect this MCU to the PC... This approach is better...
PS: I, too, am afraid of your medical equipment. Could you please tell us the model/vendor of your equipment so that we can avoid using it? :)
-
Yes, I understand, but a packet that has to be discarded on the basis of the timing information comes into play very rarely, like once in a million packets. So your life would be perfectly safe with our system.
@sush "comes into play very rarely, like once in a million packets" - oh, I understand; then try to explain this to the people whose lives are going to depend on it! You know: things which can go wrong will go wrong (something a programmer learns after some time). So your argument isn't an argument when it comes to such critical systems (it would be fine for a weather app, though).