[SOLVED] malloc fails after calling getOpenFileNames() with native file dialog
-
I'd like your help, please, with a huge memory allocation that fails after calling getOpenFileNames(), but works if I do the same using the non-native Qt file dialog. Here's the shortest piece of code that produces a runtime error because 400 MB could not be allocated:
#include <QApplication>
#include <QFileDialog>

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);
    QStringList files = QFileDialog::getOpenFileNames();
    char* x = new char[4*100*1000000]; // allocate 400 MB
    return 0;
}
I know that asking for a contiguous block of 400 MB is almost a bug in itself. However, the allocation succeeds if I either
- Don't call the getOpenFileNames() at all
- Call getOpenFileNames() with the QFileDialog::DontUseNativeDialog option
Question: Is there a way to use the Windows native file dialog and make the 400MB allocation succeed afterwards?
Notes:
- I cannot allocate before calling the file dialog, as the size of the allocation depends on the file(s) selected.
- Smaller allocations work, e.g. 100MB.
- I'm on Qt 5.5, Windows 7 64bit, the application is 32bit, mingw.
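(For anyone wanting to reproduce the fragmentation effect: a small probe, not from my actual code, that binary-searches the largest contiguous block new[] will grant. Running it before and after the dialog call shows how much contiguous address space is left; the Qt parts are omitted so the sketch stays standalone.)

```cpp
#include <cstddef>
#include <new>

// Returns true if a contiguous block of the given size can be allocated.
static bool canAllocate(std::size_t bytes)
{
    char* p = new (std::nothrow) char[bytes];
    if (!p)
        return false;
    delete[] p;
    return true;
}

// Binary-search the largest allocatable contiguous block, in MB,
// up to limitMB. The result depends on current heap fragmentation.
std::size_t largestBlockMB(std::size_t limitMB)
{
    std::size_t lo = 0, hi = limitMB;
    while (lo < hi) {
        std::size_t mid = (lo + hi + 1) / 2;
        if (canAllocate(mid * 1000000ULL))
            lo = mid;
        else
            hi = mid - 1;
    }
    return lo;
}
```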
-
Hi,
You are probably using up the entire heap memory allocated to your program; increase it if you need more. The non-native dialog might simply allocate less memory and thus leave you room for the insane char buffer.
-
Hi and welcome.
A 32-bit program on Windows can get at most 2 GB by default.
I guess QFileDialog::getOpenFileNames() fragments memory a bit, so
after the call it is not possible to get a contiguous block of 400 MB.
You could try
QMAKE_LFLAGS += -Wl,--large-address-aware
and see if that allows you to get the block.
(Note: I've never tried that with mingw.) Can I ask why you need such a huge buffer?
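Independently of the linker flag, the big allocation can also be attempted non-fatally, so the application can fall back (smaller buffer, non-native dialog, loading in parts) instead of crashing. A minimal sketch, not from the original code:

```cpp
#include <cstddef>
#include <new>

// Attempt a large allocation without throwing std::bad_alloc.
// Returns nullptr on failure, letting the caller choose a fallback.
char* tryAllocate(std::size_t bytes)
{
    return new (std::nothrow) char[bytes];
}
```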
-
Working with arrays whose size is that close to the memory available to the process requires custom memory management, which is always expensive in terms of effort. And there will always be files that are not going to fit anyway.
I would rather invest in handling files in parts.
-
@mrjj
Hi, thank you for your response. The --large-address-aware linker option has fixed the issue for me. I understand my process now has 3 GB of usable memory, which seems to be sufficient to allocate 400 MB even after the file dialog has presumably fragmented the memory. I can now allocate even 800+ MB. I might still decide to avoid the native file dialog.
The huge buffer is for loading a huge image, which is then transferred as one chunk into GPU texture memory for display. This will obviously not work on every PC; as other people have suggested, the correct solution is to avoid such a large allocation. The code is just that way at the moment.
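For the record, a sketch of the "handle files in parts" approach suggested above: reading the file in fixed-size pieces instead of one 400 MB buffer. The commented-out processChunk is a hypothetical per-chunk handler, not part of my actual code, and the GPU upload side is omitted.

```cpp
#include <cstddef>
#include <fstream>
#include <string>
#include <vector>

// Read a file in fixed-size chunks instead of one huge contiguous buffer.
// Returns the total number of bytes read (0 if the file can't be opened).
std::size_t processInChunks(const std::string& path,
                            std::size_t chunkSize = 4 * 1024 * 1024)
{
    std::ifstream in(path, std::ios::binary);
    std::vector<char> buffer(chunkSize);
    std::size_t total = 0;
    while (in) {
        in.read(buffer.data(), static_cast<std::streamsize>(buffer.size()));
        std::streamsize got = in.gcount();
        if (got <= 0)
            break;
        // processChunk(buffer.data(), got); // hypothetical per-chunk work
        total += static_cast<std::size_t>(got);
    }
    return total;
}
```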
-
@Jeroentjehome Hi, thanks for your reply. Increasing the available address space with --large-address-aware has worked around my problem.
-
@alex_malyu Hi, and thanks for replying. You are of course correct that the huge file should be handled in parts. That will have to wait, though...