Out of memory handling
-
Most modern-day programs don't seem to worry much about running out of memory. However, when working with large data sets, you can run into this limit (a ~2 GB footprint on 32-bit systems). How do you manage this, if you manage it at all? Do you care if your program crashes from memory over-use? Do you try to stop it from crashing somehow?
-
Of course it should be handled if there is a real chance of exceeding the limit (as you said, 2 GB on 32-bit). But in most applications (leaving aside scientific, financial, or other specialized domains) it happens extremely rarely, and each such case can easily be guarded against.
-
Let's talk special here. We have an application that can consume up to 2 GB in 32-bit builds. It is a rather complex thing, and there is no absolute certainty about where the bad_alloc will occur. I haven't been able to find much information about this yet, and since PCs have a crapload of memory these days, the issue is largely ignored. A lot of applications will indeed never even reach 1 GB of memory use (or 0.5 GB, for that matter). Would you say there are best practices for those rare cases?
-
Question: if you just don't have the required memory, can you do something to prevent the crash? Something like the following (see the set_new_handler sketch after this list):
- send a warning to the user "you should save your data... memory shortage coming soon... crash following"
- release unused objects... which is impossible in most cases
- manage an out-of-core strategy yourself (hand-made disk swap)
- ...
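For the "warn the user and release what you can" options above, one standard hook is std::set_new_handler: operator new calls the installed handler when an allocation fails, and retries if the handler manages to free something. Below is a minimal sketch assuming a hypothetical g_discardableCache that stands in for whatever unused objects the application can afford to drop; it is not a drop-in solution, just the shape of the approach.

    #include <cstdlib>
    #include <iostream>
    #include <new>
    #include <vector>

    // Hypothetical application-level cache we are willing to sacrifice
    // before giving up; stands in for whatever "unused objects" exist.
    std::vector<char>* g_discardableCache = nullptr;

    void onOutOfMemory()
    {
        // First attempt: release anything discardable; operator new
        // will then retry the allocation that triggered this handler.
        if (g_discardableCache && !g_discardableCache->empty()) {
            std::cerr << "Low memory: dropping cache, please save your work.\n";
            delete g_discardableCache;
            g_discardableCache = nullptr;
            return;
        }
        // Nothing left to free: warn and bail out in a controlled way
        // instead of letting bad_alloc propagate from an arbitrary point.
        std::cerr << "Out of memory: saving state and exiting.\n";
        std::exit(EXIT_FAILURE);
    }

    int main()
    {
        std::set_new_handler(onOutOfMemory);
        g_discardableCache = new std::vector<char>(64 * 1024 * 1024);
        // ... normal application work ...
    }

The handler runs at the point of failure, so whatever it does must not itself allocate; dropping caches and flushing state to disk are about the only safe moves.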
In most cases, you don't need to wait until the last KB is used. Allocation tests can be performed for big allocations, and the turnaround strategy (if any) can be started once more than 80% of the budget is in use.
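A minimal sketch of that idea in C++: probe the big allocation up front with the nothrow form of new and compare against an assumed 2 GB budget. The g_allocatedBytes counter and the 80% threshold are assumptions standing in for whatever bookkeeping the application already does; the standard library does not track this for you.

    #include <cstddef>
    #include <iostream>
    #include <memory>
    #include <new>

    // Assumed budget for a 32-bit build; the real limit depends on the
    // OS and on address-space fragmentation.
    constexpr std::size_t kBudgetBytes   = 2ull * 1024 * 1024 * 1024;
    constexpr std::size_t kWarnThreshold = kBudgetBytes / 10 * 8;   // 80%

    // Hypothetical running total maintained by the application's own
    // allocation wrappers.
    std::size_t g_allocatedBytes = 0;

    // Probe a big allocation up front instead of letting bad_alloc fire
    // deep inside the computation.
    std::unique_ptr<char[]> tryBigAllocation(std::size_t bytes)
    {
        if (g_allocatedBytes + bytes > kWarnThreshold) {
            std::cerr << "Warning: over 80% of the memory budget in use; "
                         "consider saving or switching to an out-of-core path.\n";
        }
        // nothrow form: returns nullptr on failure instead of throwing
        std::unique_ptr<char[]> block(new (std::nothrow) char[bytes]);
        if (block) {
            g_allocatedBytes += bytes;
        }
        return block;   // nullptr tells the caller to start the fallback strategy
    }

    int main()
    {
        if (auto buf = tryBigAllocation(512 * 1024 * 1024)) {
            std::cout << "Allocation succeeded, proceeding in-core.\n";
        } else {
            std::cout << "Allocation failed, falling back to disk-backed storage.\n";
        }
    }

The point is that only the handful of large, predictable allocations need this treatment; the thousands of small ones can keep throwing bad_alloc, because by the time they fail you are past the point of graceful recovery anyway.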