QFile and QTextStream on RPi don't write to file
I really have no idea what those variables refer to; I was asking about the version of the related software. As for packages, you only need QtCore to work with files (and streams), so no, it doesn't appear that you're missing anything. Currently my best guess is some kind of buffering, either by the kernel or the filesystem ...
Ok, these variables are from the buildroot defconfig.
The kernel which is linked is version 4.1,
and qt5 is the buildroot package which contains QtCore.
After some more tests I think it could be a performance problem, because when I let the program run for 1 min or longer after the function should have written to the file, it works well.
So 12 out of 15 messages were written.
When I run the same program on my Linux PC, everything works perfectly.
Most Linux systems (with ext* filesystems) will defer flushing the journal (especially if the hardware is slow), so my previous supposition stands, and it appears that might be the case here. What's the filesystem on your device?
It's ext4.
You could try passing QIODevice::Unbuffered when opening the file and see if that makes a difference, although I wouldn't rely on that flag for production code.
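For readers outside Qt, the same idea can be sketched with standard C stdio, where `setvbuf` with `_IONBF` plays roughly the role of `QIODevice::Unbuffered` (a sketch under that analogy; the file path and function name are made up):

```cpp
// Sketch (standard C++/POSIX, not Qt): disabling the application-side
// buffer, roughly what QIODevice::Unbuffered asks QFile to do.
// Note: the kernel page cache still sits between the process and the
// eMMC, so this alone does not guarantee the data reaches the flash chip.
#include <cstdio>

bool write_unbuffered(const char *path, const char *line) {
    std::FILE *f = std::fopen(path, "w");
    if (!f)
        return false;
    std::setvbuf(f, nullptr, _IONBF, 0);  // no stdio buffer: one write() per call
    std::fputs(line, f);                  // goes straight to the kernel
    std::fclose(f);
    return true;
}
```

As the reply above says, removing one buffering layer rarely fixes data loss on power-off, because the kernel still caches the pages.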
I already tried the Unbuffered flag; it makes nearly no difference for me. I also tried to flush the stream after writing, with the same result.
But thanks a lot for your help anyway.
I also tried to flush the stream after writing, with the same result.
This wouldn't work, as the Qt buffers are flushed implicitly when you invoke endl or close the file.
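The layering behind this can be sketched in standard C++/POSIX (not Qt; the path and function name are made up): `fflush` corresponds to what `QTextStream::flush()`, `endl`, or closing the file does, moving data from the application buffer into the kernel page cache, while `fsync` is the extra step that asks the kernel to push it to the storage device:

```cpp
// Sketch (standard C++/POSIX, not Qt): the two buffering layers involved.
// fflush() : application buffer -> kernel page cache (like QTextStream::flush / endl)
// fsync()  : kernel page cache  -> storage device    (no direct QTextStream equivalent)
// On sudden power loss, data that has only reached the page cache is lost.
#include <cstdio>
#include <unistd.h>   // fileno, fsync

bool write_log_synced(const char *path, const char *line) {
    std::FILE *f = std::fopen(path, "a");
    if (!f)
        return false;
    std::fputs(line, f);
    std::fflush(f);            // step 1: app buffer -> kernel
    fsync(fileno(f));          // step 2: kernel -> eMMC/SSD
    std::fclose(f);
    return true;
}
```

This would explain why flushing the stream made no difference in the tests above: only step 1 was being repeated.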
I already tried the Unbuffered flag; it makes nearly no difference for me.
Well, the only other thing I could think of is to try and play a bit with the filesystem's flags; for example (assuming you have SSD storage on the device), passing the journal_async_commit option when mounting might make a difference.
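As a sketch of what that could look like (assumptions: the log data lives on its own partition, here `/dev/mmcblk0p2` mounted at `/data`; whether `journal_async_commit` is available and safe depends on your kernel and its ext4 version):

```
# Remount the log partition with async journal commits and a short
# commit interval (the ext4 default is 5 s), narrowing the window in
# which committed application data can be lost on power-off.
mount -o remount,journal_async_commit,commit=1 /data
```

The `commit=` interval is a separate, widely supported knob; shortening it trades write amplification on the eMMC for a smaller loss window.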
The RPi compute module uses eMMC flash storage. Do you have any suggestions about changing flags for that?
I don't know much about this stuff.
Just realised that when I open the files in vim, at every position where it had failed to write, the file has a line of characters that I can only see in vim. A text editor says the file has no common codec, and when I cat the file it shows only the "good" entries.
Hi and welcome to devnet,
You should also add an explicit error message for when the file fails to open. You might get some clues about what is happening.
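In Qt that means checking the return value of QFile::open() and printing QFile::errorString(). The same pattern sketched in standard C++ (the path and function name are made up) looks like:

```cpp
// Sketch (standard C++, not Qt): report *why* an open failed instead of
// failing silently. The Qt equivalent is checking QFile::open()'s return
// value and logging QFile::errorString().
#include <cstdio>
#include <cstring>
#include <cerrno>

bool open_or_report(const char *path) {
    std::FILE *f = std::fopen(path, "a");
    if (!f) {
        std::fprintf(stderr, "cannot open %s: %s\n", path, std::strerror(errno));
        return false;
    }
    std::fclose(f);
    return true;
}
```

A message like "cannot open /data/boot.log: Read-only file system" immediately narrows down whether the problem is the open or the write.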
Hi and thanks,
I have already added a debug message on file.open(), which has never shown an error. And I check the stream status after writing, while the file is open; QTextStream::status() always returns 0:
QTextStream::Ok (0): The text stream is operating normally.
So the file is always opened correctly, and QTextStream seems to write correctly.
Because, as I mentioned before, when it doesn't save the right message, the file has the same amount of chars in it, but it looks like
So, as kshegunov mentioned, it looks like the buffer has not been written to the file ...
Maybe it is because I never close my program properly or shut down the system correctly, and I do not have the possibility to do so.
Why can't you shut down your system properly?
If you indeed always "pull the plug", that won't keep your system clean, and not only the files you write but everything else will suffer too.
What is your exact use case?
My device is used as a controller and interface for a small kind of electric vehicle, which is controlled via CAN bus.
And the system is turned on and off via one button, which is also the emergency-off button, so it cuts off the battery completely.
How fast and how often will you be writing these files?
How large should they become?
Also, should they both survive a hard restart?
These files are opened at every system startup, and they log the initialisation of the CAN devices and some serial and I2C devices, as well as the RFID key which was used to unlock the machine.
So yes, they should survive, because I would need them for debugging.
So basically you need a read-only system, but with the ability to write to some files.
Thanks for that.
So I now have a second partition for log files, which is read/write, and one for the filesystem.
That should make the system much safer.
I fear the system will never be really "safe": killing the application, and even worse the OS, in such a notorious way is bound to have side effects. OSes and filesystems are not made to be switched off at arbitrary times as a normal exit sequence, albeit they do feature a number of measures to try and minimise the implications of sudden power loss. All that said, no matter how you partition the disk or how you play with the fs flags, it can't be guaranteed that the data will (or won't) be written. The best thing I can suggest at this time is to research whether it's possible to run ext4 in an unbuffered (synchronous) mode; it might be slower, but at least you should have most of the data in the files.
In addition to what @kshegunov proposed, a system running on a read-only filesystem, with a RAM disk used for the parts that need to be writable but not stored long-term, could be an option.
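A sketch of what such a RAM disk could look like in /etc/fstab (the mount point and size are assumptions; anything written there vanishes at power-off, so it only suits data that does not need to survive, while logs that must survive belong on a persistent read/write partition):

```
tmpfs  /var/volatile  tmpfs  defaults,size=16m  0  0
```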
I have now found a solution where I shut down my system properly after getting a CAN message from my controller, and I added an external power supply which stays on 4 seconds longer than the rest of the system, which is enough time to shut down.
Everything works perfectly now.
Thanks a lot.