QFile::size() returning incorrect value when reading through a LAN on Windows.
-
I have a class that opens a log file and every 500 ms checks if there are new lines to read.
The file is read over the LAN, with a path like "//192.168.1.102/shared/file.log". Sometimes the file gets cleared; in that case I have to reset the file pointer.
While debugging I noticed that the pointer was being reset even though the file hadn't been cleared.
Sometimes (m_file->pos() > m_file->size()) would randomly return true. Looking at the console it said, for instance, that pos was 15000 and size was 14500, even though we had just read the new lines and the actual file size IS 15000. I don't have a clue why size would 'lag' X_x.
Also, when reading over the LAN, I noticed that new lines weren't read instantly. In fact, they came in groups, suggesting that the file was being buffered. Maybe that's how Windows' file sharing works?
When debugging locally, both issues disappeared (although that may be luck, since it happens randomly). Unfortunately I need to support LAN.
The second issue doesn't really matter, but the first one does, since it reports false information.
I'm using Windows 7 for debugging, reading a file over the LAN from a Windows Server 2008 system. I honestly have no clue what is causing this. I will try modifying the code a bit so that it always opens and closes the file in getNewLines. Maybe that would somehow force size to refresh o_O?
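Something like this, maybe (untested sketch; m_lastPos would be a new member that remembers where the previous poll stopped reading):
@
// Untested sketch: reopen the file on every poll instead of keeping it open,
// hoping that a fresh handle also means a fresh size.
// m_lastPos is a new qint64 member, initialized to 0 (or to the file size in
// start() if old lines should be skipped).
int CLogWatcher::getNewLines()
{
    QFile file(m_fileName);
    if (!file.open(QFile::ReadOnly)) {
        qWarning() << tr("Read error -> %1").arg(file.errorString());
        return -1;
    }

    // If the file shrank, it was probably cleared: start over.
    if (m_lastPos > file.size())
        m_lastPos = 0;

    if (!file.seek(m_lastPos)) {
        qWarning() << tr("Seek error -> %1").arg(file.errorString());
        return -1;
    }

    int lineCount = 0;
    while (!file.atEnd()) {
        QString line = file.readLine();
        if (!line.trimmed().isEmpty()) {
            lineCount++;
            emit gotLine(line);
        }
    }

    m_lastPos = file.pos();
    file.close();
    return lineCount;
}
@
For reference, this is the current implementation, which keeps the file open between polls: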
@
CLogWatcher::CLogWatcher(const QString & file, QObject *parent) :
    QObject(parent)
{
    m_file = new QFile(this);
    m_timer = new QTimer(this);
    m_timer->setInterval(500);
    connect(m_timer, SIGNAL(timeout()), SLOT(getNewLines()));
    m_fileName = file;
}

bool CLogWatcher::start()
{
    // [...] some validation routines have been removed
    m_file->setFileName(m_fileName);

    if (!m_file->open(QFile::ReadOnly)) {
        qWarning() << tr("Read error -> %1").arg(m_file->errorString());
        return false;
    }

    // Seek to the end
    if (!m_file->seek(m_file->size())) {
        qWarning() << tr("Seek error -> %1").arg(m_file->errorString());
        m_file->close();
        return false;
    }

    m_timer->start();
    return true;
}

void CLogWatcher::stop()
{
    // Close file
    m_file->close();
    m_timer->stop();
}

int CLogWatcher::getNewLines()
{
    if (!m_file->isOpen()) {
        qWarning() << tr("LogWatcher: Log file is closed.");
        return -1;
    }

    // if this is true, it means the log file has been cleared
    // and we have to start reading from the beginning
    if (m_file->pos() > m_file->size()) {
        qWarning() << tr("LogWatcher: File pointer was bigger than file size. -> %1/%2")
                      .arg(m_file->pos())
                      .arg(m_file->size());
        m_file->seek(0);
    }

    int lineCount = 0;
    while (!m_file->atEnd()) {
        QString line = m_file->readLine();
        if (!line.trimmed().isEmpty()) {
            lineCount++;
            emit gotLine(line);
        }
    }

    return lineCount;
}
@
-
[quote author="Andre" date="1300030795"]Do you get the same issues if you map the shared folder to a network drive in Windows?[/quote]
I'll try that. The thing is that it's hard to debug since it happens randomly (or at least I don't know the trigger). It took half an hour before it occurred again.
EDIT: Debugging as suggested, with the shared folder mapped to a Windows network drive (Z:).
The buffer/flush issue is still there. Going to wait a few minutes to see if the size issue appears.
-
[quote author="ivan.todorovich" date="1300031823"]
[quote author="Andre" date="1300030795"]Do you get the same issues if you map the shared folder to a network drive in Windows?[/quote]
I'll try that. The thing is that it's hard to debug since it happens randomly (or at least I don't know the trigger). It took half an hour before it occurred again.[/quote]
Yeah, those are very hard issues to debug, I know. What I am wondering, though, is whether it is really a good idea to keep a file that is being altered by another process, and that you access over a network, open for so long. Personally, I would think that is a recipe for trouble. There are just too many things that can go wrong. Perhaps you could consider using a QFileSystemWatcher instead?
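Roughly along these lines, perhaps (just a sketch; m_watcher, m_lastPos and onFileChanged() are made-up names, and I don't know how reliable change notifications are over a network share):
@
// Sketch only: watch the file and open it just long enough to read the new part.
// Somewhere in start():
m_watcher = new QFileSystemWatcher(this);
m_watcher->addPath(m_fileName);
connect(m_watcher, SIGNAL(fileChanged(QString)), SLOT(onFileChanged()));

// Slot, called whenever the watcher reports a change:
void CLogWatcher::onFileChanged()
{
    QFile file(m_fileName);
    if (!file.open(QFile::ReadOnly))
        return;

    if (m_lastPos > file.size())   // file was cleared, start over
        m_lastPos = 0;

    file.seek(m_lastPos);
    while (!file.atEnd()) {
        QString line = file.readLine();
        if (!line.trimmed().isEmpty())
            emit gotLine(line);
    }
    m_lastPos = file.pos();
}
@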
-
Ok, same thing.
size went out of sync using the network drive.
What's strange is that I was running 3 instances of the app:
1. Using the network drive (Z:/file.log).
2. Using normal folder sharing (//192.168.1.102/shared/file.log).
3. Using a local folder (running the app directly on the server).
Nº 1 failed six times (seems even worse, but that's probably just luck).
Nº 2 failed two times (although the previous test already showed it can fail).
Nº 3 didn't fail.
So this makes me believe there is no specific trigger. (If there were, shouldn't it affect all instances at the same time?)
-
Just a thought... I assume it isn't an option to have a watcher app running locally on the server? That app could watch the logfile and provide an interface to connect to over the network (perhaps using QxtRPCPeer or something like that?)
It might prove more reliable than directly accessing files over the network... I do realize that would be a work-around, not a solution.
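To make the idea a bit more concrete, the server side could look roughly like this (a sketch using a plain QTcpServer instead of QxtRPCPeer, purely to illustrate; LogServer and the port number are made up):
@
// Very rough sketch: LogServer runs on the server, tails the local log file
// (e.g. with a CLogWatcher-like object) and pushes every new line to all
// connected clients. m_clients would be a QList<QTcpSocket *>.
void LogServer::start()
{
    m_server = new QTcpServer(this);
    m_server->listen(QHostAddress::Any, 12345);   // port number is arbitrary
    connect(m_server, SIGNAL(newConnection()), SLOT(onNewConnection()));
}

void LogServer::onNewConnection()
{
    m_clients.append(m_server->nextPendingConnection());
}

void LogServer::onGotLine(const QString &line)    // connected to the tailer's gotLine()
{
    foreach (QTcpSocket *client, m_clients)
        client->write(line.toUtf8());
}
@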
-
[quote author="Andre" date="1300032579"]
Yeah, those are very hard issues to debug, I know. What I am wondering, though, is whether it is really a good idea to keep a file that is being altered by another process, and that you access over a network, open for so long. Personally, I would think that is a recipe for trouble. There are just too many things that can go wrong. Perhaps you could consider using a QFileSystemWatcher instead?
[/quote]
Yes, you might be right. Although theoretically it should work..
I'll try using a QFileSystemWatcher and opening the file only as needed. I could even fix it with something like the following (haven't tested it, though). This way it would only parse lines when the size changes:
@
int CLogWatcher::getNewLines()
{
    if (!m_file->isOpen()) {
        qWarning() << tr("LogWatcher: Log file is closed.");
        return -1;
    }

    // if this is true, it means the log file has been cleared
    // and we have to start reading from the beginning
    if (m_file->pos() > m_file->size()) {
        qWarning() << tr("LogWatcher: File pointer was bigger than file size. -> %1/%2")
                      .arg(m_file->pos())
                      .arg(m_file->size());
        m_file->seek(0);
    }

    // remember the size seen on the previous poll
    static qint64 oldSize = 0;
    int lineCount = 0;

    if (m_file->size() > oldSize) {
        while (!m_file->atEnd()) {
            QString line = m_file->readLine();
            if (!line.trimmed().isEmpty()) {
                lineCount++;
                emit gotLine(line);
            }
        }
        oldSize = m_file->size();
    }

    return lineCount;
}
@
Since QFile::atEnd() also checks whether there is buffered data left to return, maybe I'm getting the lines even before the file's size attribute is updated(?)
Sounds strange, but that's the only theory I have right now.
-
[quote author="Andre" date="1300033834"]Just a thought... I assume it isn't an option to have a watcher app running locally on the server? That app could watch the logfile and provide an interface to connect to over the network (perhaps using QxtRPCPeer or something like that?)
It might prove more reliable than directly accessing files over the network... I do realize that would be a work-around, not a solution.[/quote]
Not an option. In this particular case, the point of using LAN file access is to leave the server intact.
In the future, I'll also need to support FTP.
-
[quote author="ivan.todorovich" date="1300034188"][quote author="Andre" date="1300033834"]Just a thought... I assume it isn't an option to have a watcher app running locally on the server? That app could watch the logfile and provide an interface to connect to over the network (perhaps using QxtRPCPeer or something like that?)
It might prove more reliable than directly accessing files over the network... I do realize that would be a work-around, not a solution.[/quote]
Not an option. In this particular case, the point of using LAN file access is to leave the server intact.
In the future, I'll also need to support FTP.[/quote]
OK, just tossing ideas around :)
-
[quote author="ivan.todorovich" date="1300034077"]
[...]
Since QFile::atEnd() also checks whether there is buffered data left to return, maybe I'm getting the lines even before the file's size attribute is updated(?)
Sounds strange, but that's the only theory I have right now.
[/quote]
Following this theory, I'm trying this:
@
int CLogWatcher::getNewLines()
{
    if (!m_file->isOpen()) {
        qWarning() << tr("LogWatcher: Log file is closed.");
        return -1;
    }

    // if this is true, it means the log file has been cleared
    // and we have to start reading from the beginning
    if (m_file->pos() > m_file->size()) {
        qWarning() << tr("LogWatcher: File pointer was bigger than file size. -> %1/%2")
                      .arg(m_file->pos())
                      .arg(m_file->size());
        m_file->seek(0);
    }

    int lineCount = 0;
    //while (!m_file->atEnd()) {
    while (m_file->pos() < m_file->size()) {
        QString line = m_file->readLine();
        if (!line.trimmed().isEmpty()) {
            lineCount++;
            emit gotLine(line);
        }
    }

    return lineCount;
}
@
So far so good.
-
Bleh, same error.
And to make it official - I HAVE NO CLUE WHY THIS IS HAPPENING >.<
How the hell can (m_file->pos() > m_file->size()) be TRUE if I only read while (m_file->pos() < m_file->size())?
It makes no sense! (at least to me) :/
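Maybe size() itself returns a different value between two calls. To check that, I could take one snapshot of size() per poll, compare pos() only against that snapshot, and log both values (untested):
@
// Untested: one size() snapshot per poll, so pos() is never compared against
// a value that was fetched at a different time.
int CLogWatcher::getNewLines()
{
    const qint64 sizeSnapshot = m_file->size();
    const qint64 posBefore    = m_file->pos();

    if (posBefore > sizeSnapshot) {
        qWarning() << tr("pos/size mismatch -> %1/%2").arg(posBefore).arg(sizeSnapshot);
        m_file->seek(0);
    }

    int lineCount = 0;
    while (m_file->pos() < sizeSnapshot) {   // compare against the snapshot only
        QString line = m_file->readLine();
        if (!line.trimmed().isEmpty()) {
            lineCount++;
            emit gotLine(line);
        }
    }
    return lineCount;
}
@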
-
[quote author="Andre" date="1300040355"]Perhaps you should file a bug report?[/quote]
Yes, I will... Thanks for your help!
-
[quote author="Volker" date="1300056406"]I suspect it to be some caching problem on the OS level (could be some race condition too).[/quote]
I don't know what you mean by a race condition, but I agree it could be caching at the OS level. Still, the same algorithm works just fine in PHP (using filesize(), fopen(), feof() and fgets()) and in Ruby too (file(), filestats() to find the size, tell() for the cursor pos and readlines()), so maybe the framework should prevent this issue somehow.
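To narrow down which layer reports the stale value, I could also log a freshly stat'ed size next to QFile::size() on every poll (untested sketch; I'm assuming a newly constructed QFileInfo re-reads the file metadata, though the SMB client might still serve it from its own cache):
@
// Untested diagnostic: compare QFile's idea of the size with a fresh stat
// via QFileInfo (needs <QFileInfo>).
const qint64 qfileSize = m_file->size();
const qint64 statSize  = QFileInfo(m_fileName).size();

if (qfileSize != statSize)
    qWarning() << tr("size mismatch: QFile says %1, QFileInfo says %2")
                  .arg(qfileSize).arg(statSize);
@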
Tomorrow I'll take a deeper look at QFile's source code to see if I can find something suspicious (I doubt it, though).
Also, if the problem persists, I'll try a different approach. (I haven't tried QFileSystemWatcher as suggested by Andre yet.)
It's really annoying to debug: after the last modification it failed only ONCE, and then worked without problems for 3-4 hours until I closed it!