memory performance problems
-
Hi,
I'm working on a viewer for 3D files, but I have a memory problem.
As you can see in the image, I ran the VS diagnostic tool. Between the individual states I opened different files. Just before state 6 I opened a very big file, but after I then opened a smaller one (which I had already opened before), the memory usage did not go back down.
I cannot find any data that I allocated myself and forgot to free (as I did previously in a different project without Qt), so I don't know how to track down where I forgot to release the memory. Even when I run the program and close it without opening a single file, a lot of memory is reported as leaked, all of it related to Qt5Cored, ...
So I wonder: did I do something wrong, or is Qt really that inefficient?
-
On Windows that diagnostic tooling is painful and I never managed to get satisfying results out of it. On Linux there is Valgrind.
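If you can build the viewer on Linux, a typical Valgrind run looks like this (the binary name "viewer" is just a placeholder for your executable):

    valgrind --leak-check=full --show-leak-kinds=definite ./viewer

Blocks reported as "definitely lost" are real leaks in your own code; Qt's one-time startup allocations usually show up as "still reachable" and can be ignored.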
The "nuclear bomb" solution is:
- explicitly declare all destructors and declare them virtual
- make sure every time you create a QObject with new you supply a parent
- do what the nutjobs in the C++ ISO committee are trying to convince us to do: convert any pointer that is responsible for lifeclycle management (other than those handled by Qt's parent-child so the QObjects) to a smart pointer (either the C++11 or Qt way)
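A minimal sketch of all three points together; the Mesh and Viewer classes and their members are made up for illustration, not taken from your project:

    #include <QObject>
    #include <QString>
    #include <memory>
    #include <vector>

    // Plain data owned by the viewer: lifecycle managed by a smart
    // pointer, not by Qt's parent-child tree.
    struct Mesh {
        std::vector<float> vertices;
    };

    class Viewer : public QObject {
        Q_OBJECT
    public:
        // Always pass the parent up: Qt deletes children with the parent.
        explicit Viewer(QObject *parent = nullptr) : QObject(parent) {}

        // Explicit virtual destructor, so deleting through a base
        // pointer runs the derived cleanup (QObject's is already virtual).
        ~Viewer() override = default;

        void open(const QString &path) {
            Q_UNUSED(path);
            // Replacing the unique_ptr frees the previously loaded mesh
            // automatically; there is no manual delete to forget.
            m_mesh = std::make_unique<Mesh>(/* load from path here */);
        }

    private:
        std::unique_ptr<Mesh> m_mesh; // C++11 way; QScopedPointer is the Qt way
    };

With this pattern, opening a new file releases the old mesh as a side effect of the assignment, which is exactly the kind of release that is easy to forget with raw pointers.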
-
Seems to be a Qt-specific problem.
Every memory state is one breakpoint. Any ideas?
-
I would rule out memory leaks in any of the seasoned Qt classes unless I could find no other explanation.
I think the increase in memory here is just due to the DLLs being loaded, but I'm not technically proficient enough to tell for sure. Conjuring the real brains here: @Moderators, any insight?
-
Hi,
Qt allocates some internal data at startup in order to work properly, hence the baseline memory consumption.
Also, you have to take into account that freed memory is not necessarily returned to the OS immediately: the allocator often keeps it cached for reuse, so tools that measure the working set can show usage staying high even though nothing is actually leaked.
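A quick way to observe this on Windows is to print the process working set around a large allocate-and-free cycle. This is a minimal sketch using the Win32 PSAPI, not anything Qt-specific, and the 200 MB size is an arbitrary choice:

    #include <windows.h>
    #include <psapi.h>
    #include <cstdio>
    #include <vector>
    #pragma comment(lib, "psapi.lib") // MSVC: link against PSAPI

    // Print the current working set size in MB.
    static void printWorkingSet(const char *label) {
        PROCESS_MEMORY_COUNTERS pmc{};
        GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc));
        std::printf("%s: %zu MB\n", label,
                    (size_t)(pmc.WorkingSetSize / (1024 * 1024)));
    }

    int main() {
        printWorkingSet("start");
        {
            // Simulate loading a big file: ~200 MB, touched so it is committed.
            std::vector<char> big(200 * 1024 * 1024, 1);
            printWorkingSet("after allocation");
        } // vector destroyed (freed) here
        printWorkingSet("after free");
        // "after free" is typically lower but often not back to "start":
        // the CRT heap may keep pages cached for reuse.
        return 0;
    }

If the number after the free keeps growing across repeated open/close cycles, that is a leak; if it plateaus, it is just allocator caching.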