Author Topic: Memory leaks  (Read 21708 times)

Luca

  • Guest
Memory leaks
« Reply #15 on: July 16, 2005, 02:56:26 pm »
I would like to come back to the original topic of this thread, that is, memory leaks (though I may give my opinion on memory management systems later).

The fact that, when the main window of an application is minimized, the memory consumption falls to 0 DOES NOT MEAN that there aren't memory leaks. Indeed, quite the opposite is true!

What happens when a user minimizes a window? Windows thinks "Oh well, this application won't be used for a while, especially its GUI, so maybe it's a good time to tidy up", and consequently swaps the whole application out of memory. Task Manager shows two relevant columns: "Memory usage" (the amount of RAM used by the application) and "Virtual memory size" (the TOTAL memory allocated by the application, which may reside on disk). When you minimize the app, the memory usage falls to 0, while the virtual memory size does not change.

After you minimize an application, it will immediately make some memory accesses in order to keep working in the background: some page faults will occur, and a (hopefully small) part of the application will be brought back into RAM.

Now, what happens if there is a memory leak? Leaked memory is, by definition, memory that the programmer forgot to free, and that now sits unused in the virtual memory of your application: allocated, yes, but unused. Consequently, when Windows swaps it out (because of a minimization, for example, or because there is not enough room in RAM), it will never be used again; it will never cause another page fault, and so it will never be brought back into RAM.
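To make the definition concrete, here is a minimal sketch of my own (not code from Code::Blocks); the counter `live_bytes` stands in for what Task Manager would report as the process's growing virtual memory size, and `process_request` is a hypothetical name for illustration:

```cpp
#include <cassert>
#include <cstddef>

static std::size_t live_bytes = 0;   // bytes allocated and never freed

// Wrapper that tracks how much memory is currently committed.
char* tracked_alloc(std::size_t n) {
    live_bytes += n;
    return new char[n];
}

void process_request() {
    char* buf = tracked_alloc(4096);   // hypothetical per-request buffer
    buf[0] = 'x';                      // used once...
    // ...then the pointer goes out of scope without `delete[]`.
    // The 4 KiB stay committed in the process's virtual memory, are never
    // touched again, and (once paged out) never fault back into RAM.
}
```

Every call leaks another 4 KiB: the "memory usage" column can still fall to 0 on minimization, while these pages keep inflating the "virtual memory size" column.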

Thus, after the minimization, the memory usage will remain small (unless new memory leaks occur), but the virtual memory size will remain large, because the leaked memory is still in there!

My suggestions are the following ones:
1. Check the virtual memory size in Task Manager: if it stays large even after minimization, that confirms the memory leak;
2. Try to solve the memory leak!!!

I hope I managed to be clear...
Regards, and thank you for this wonderful software,
Luca

Luca

  • Guest
Memory leaks
« Reply #16 on: July 16, 2005, 03:41:07 pm »
And now, some considerations about the OS issue.

I agree with kagerato: there is no single best strategy for memory management, even from a theoretical point of view. (The truly optimal page replacement algorithm, Belady's, would need to know future accesses, so it is unrealizable; in practice you settle for approximations like LRU or FIFO.)
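The gap between a practical policy like LRU and the clairvoyant optimum is easy to measure on a toy reference string. A small sketch of mine (not from the thread):

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <list>
#include <vector>

// Count page faults under LRU replacement with `frames` physical frames.
int lru_faults(const std::vector<int>& refs, std::size_t frames) {
    std::list<int> order;   // front = most recently used, back = LRU victim
    int faults = 0;
    for (int p : refs) {
        auto it = std::find(order.begin(), order.end(), p);
        if (it != order.end()) {
            order.splice(order.begin(), order, it);   // hit: move to front
        } else {
            ++faults;                                 // miss: page fault
            if (order.size() >= frames) order.pop_back();  // evict LRU page
            order.push_front(p);
        }
    }
    return faults;
}
```

On the classic reference string 1,2,3,4,1,2,5,1,2,3,4,5, LRU takes 10 faults with 3 frames and 8 with 4, while Belady's offline optimum would need only 7 with 3 frames; no online policy can close that gap in general.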

What an OS can do is to implement some clever "tricks" to improve the performance of the paging algorithm in practice.

What Windows does is, IMO, clever and effective: it tries to be "stingy", i.e. it tries to give an application the minimal amount of RAM that still lets it run smoothly, i.e. without incurring too many page faults. This is implemented with (at least) two mechanisms:

1. From time to time, during the execution of an application, Windows reduces the RAM available to the application by a small amount. If the application was using more RAM than it really needed, we are happy, because Windows has recovered some memory that can be useful elsewhere. On the contrary, if the application needed all the RAM it was using, then it no longer has enough memory to perform its computation efficiently and must continuously swap to disk: a huge number of page faults occurs. Windows detects this situation, gives some memory back to the application, and we are happy again. Technically, the subset of an application's virtual memory that is currently in active use is called its WORKING SET.
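That trim-and-grow feedback loop can be caricatured in a few lines. This is my own toy model, not Windows' actual algorithm: the "process" cycles through `need` pages, which is LRU's worst case whenever it has fewer frames than pages, so it either faults on every access or not at all:

```cpp
#include <cassert>

// Faults per interval for a process cycling through `need` pages under LRU:
// zero once everything fits, otherwise every access misses (LRU's worst
// case on a cyclic access pattern).
int faults_per_interval(int frames, int need) {
    return frames >= need ? 0 : need;
}

// One controller step: trim a frame when the process is quiet,
// give a frame back when it starts thrashing.
int adjust(int frames, int need) {
    int faults = faults_per_interval(frames, need);
    if (faults == 0) return frames - 1;   // looks over-provisioned: trim
    return frames + 1;                    // thrashing: return some memory
}
```

Starting the process with far more frames than it needs, repeated `adjust` steps walk the allocation down until it hovers right at the working-set size, which is exactly the "stingy" behavior described above.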

2. When you minimize an application, Windows pages it out completely in order to detect its new working set. This is done for two reasons: (i) when you minimize a program, you won't use its GUI, so some of the memory used by the application's GUI will not be part of its new working set for as long as the app stays minimized; (ii) when you minimize an application, you probably won't use it for a while (unless it is doing some computation in the background), so it won't make many memory accesses at all. Thus, again, it is VERY reasonable to swap the application out; the worst that can happen is that it gets swapped back in immediately.

And here comes another important fact. Paging out some memory does not imply an immediate disk access. In fact, simplifying a bit, the swapped-out page is now considered "free memory" by Windows, which will use it as disk cache. But, as a page of disk cache, it already contains something useful: the swapped-out page itself. Thus, if that page turns out to be needed by the application after all, it can be swapped in again WITHOUT ANY DISK ACCESS!

Pictorially, here is the path for a memory page:
APPLICATION MEMORY (in RAM) <--> DISK CACHE (in RAM) <--> VIRTUAL MEMORY (on DISK)
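This three-stage path can be modeled in a few lines. A sketch of my own, with invented names (Windows calls the middle stage the "standby list"); the point is that only the last transition costs a disk read:

```cpp
#include <cassert>
#include <set>

struct Memory {
    std::set<int> resident;   // pages in the process working set (RAM)
    std::set<int> standby;    // trimmed pages still in RAM, used as disk cache
    std::set<int> on_disk;    // pages that survive only in the pagefile
    int disk_reads = 0;

    void trim(int page) {                  // minimize: working set -> standby
        if (resident.erase(page)) standby.insert(page);
    }
    void repurpose(int page) {             // cache pressure: standby -> disk
        if (standby.erase(page)) on_disk.insert(page);
    }
    void access(int page) {
        if (resident.count(page)) return;             // no fault at all
        if (standby.erase(page)) {                    // soft fault:
            resident.insert(page);                    // no disk access needed
            return;
        }
        on_disk.erase(page);                          // hard fault:
        ++disk_reads;                                 // one real disk read
        resident.insert(page);
    }
};
```

Trimming on minimize is cheap precisely because touching a standby page is a soft fault; only pages whose cache frame was meanwhile repurposed for something else have to come back from disk.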

And this is why, when you minimize an app, very few disk I/Os occur! Try minimizing Firefox (which is a monster of memory consumption) and see what happens to your disk and to the memory figures in Task Manager...

My conclusion: I think that memory management is well done in NT. I have used the Linux 2.4.x kernel, and the system easily started thrashing as soon as the total virtual memory exceeded my system RAM; this happened far less often in Windows. (I don't know what happens in Linux 2.6.x, because my new system has quite a lot of RAM.) My feeling was confirmed by a scientific paper I read once (I don't remember where I found it).

This DOES NOT MEAN that "Windows is better than Linux": there are other basic operations where, in my own experience, Linux outperforms Windows (reading large files from disk, for example).

Greets,
Luca

Offline kagerato

  • Multiple posting newcomer
  • *
  • Posts: 56
    • kagerato.net
Memory leaks
« Reply #17 on: July 16, 2005, 07:03:15 pm »
Quote from: thomas
Here you prove me right, because this is the *exact* reason why deliberately dropping pages is evil.


You're still stuck on a non-sequitur, friend.  That is ultimately irrelevant, though; no matter the reasoning or evidence, you would not be able to objectively demonstrate that one philosophy or method is better than the other.  In essence, what you are trying to prove is that one computer program (or at least one part of it) is better written than another.

This is the nature of arguing opinions; you present what you know and organize theories around your experience.  When a separate body of evidence contradicts your standing, it becomes necessary to introduce newer (often more accurate and less broad) conclusions.  The only conclusions which carry significant meaning, however, are those that can be supported by an overwhelming body of evidence.

Strengthening any line of reasoning reduces to three simple steps:

1.) Depersonalize the message.  Remove as many references to the first person as possible.  Statements which are heavily supported should appear to originate from a body of authors, not just one.

2.) Objectification.  Strip any statements which are primarily (or entirely) opinion.  Rewrite as many of the basic elements in relative terms (avoid references to absolute concepts).

3.) Reduction and reinforcement.  Condense the reasoning to its most basic premises and primary conclusion.  Use the strongest pieces of evidence available, and drop those that are weak or easily refuted.  Continue to add new information as it is discovered.

Quote from: thomas
"Human noticeable" means one million times, by the way.


This is quite the arbitrary definition.  It is proper to at least display what train of thought generated it; otherwise there is little reason to respond to such a statement.

Quote from: thomas
Blaming bash or an intermediate api layer as responsible for running an identical program 4.6 times slower (we are talking about 4.6 times, not 4.6 percent) is hilarious. Even more so as the hardware on the Windows system is superior in every respect. It could very well burn a few extra CPU cycles, no one would notice.


bash itself is only indirectly the problem in my hypothesis.  It is the layers necessary to run bash on Windows which introduce the actual dilemma.  If one is to operate scientifically, it is necessary to remove as many alternate causal explanations as possible before drawing a definite deduction.

Reiteration of several facts is certainly warranted by this point:

1.) This individual was posed a question: the task of determining an alternative cause for a phenomenon.  The particular phenomenon has not been reproduced in any fixed environment or by any objective observer.

2.) An alternate explanation was provided, but immediately rejected as absurd -- without any evidence.

3.) The burden of proof lies with the one who presented the original assertion.  The true problem here is in the nature of the discussion.  One person is attempting to draw an absolute conclusion from excessively insufficient data, and is using anecdotal evidence as his only actual support.  The other person is trying, and clearly failing miserably, to present the reasons for which there appears to be no objective truth in this situation (nor indeed any other -- "objective truth" is an oxymoron).

In short, the initial assertion has not been provided with nearly enough substantiation to make it a more reasonable opinion than "there is no best".

Quote from: thomas
But I see there is little point in discussing this issue further, you will not agree anyway.


Perhaps my intent was not to prove one position as correct, but rather to show that it is narrow-minded to hold one side as irrefutably correct on an issue which has a wide breadth of experience and knowledge?

Quote from: Luca
And now, some considerations about the OS issue (...)


Very well described.  Your technical knowledge of the situation is greater than my own; therefore your terminology and understanding is more complete.

Offline thomas

  • Administrator
  • Lives here!
  • *****
  • Posts: 3979
Memory leaks
« Reply #18 on: July 16, 2005, 08:33:30 pm »
Well, thank you for the course in scientific work. :)
I could reply something to a couple of your remarks, but I will abstain. This would lead nowhere.

Also, the discussion went so far off topic, it does not really contribute to the original question (what was it, anyway?). So we're wasting other people's time, which is no good.
"We should forget about small efficiencies, say about 97% of the time: Premature quotation is the root of public humiliation."

Offline kagerato

  • Multiple posting newcomer
  • *
  • Posts: 56
    • kagerato.net
Memory leaks
« Reply #19 on: July 18, 2005, 01:02:08 am »
The original purpose of the thread was to determine whether Code::Blocks contained some kind of memory leak.  However, the anonymous poster was only making an inquiry -- he/she did not actually pinpoint such a problem.  The guest's system did not suffer any real performance drawback, so this issue became essentially self-nullified.

Threads often spin off on tangents like this one because they are abandoned without definite closure.  At least it was a tangential topic, though.  This poster has seen many a phpBB thread jump from issue to issue with absolutely no correlation whatsoever.

Luca

  • Guest
Memory leaks
« Reply #20 on: July 18, 2005, 02:56:05 am »
Quote from: kagerato
The guest's system did not suffer any real performance drawback, so this issue became essentially self-nullified.


Alright, but even if he did not experience any performance drawback, the bug (i.e. the memory leak) is still present... I suggest looking into the problem!

Regards,
Luca

Offline rickg22

  • Lives here!
  • ****
  • Posts: 2283
Memory leaks
« Reply #21 on: July 18, 2005, 05:27:49 am »
I thought there weren't memory leaks, but that it was the behavior of MS Windows...

Offline thomas

  • Administrator
  • Lives here!
  • *****
  • Posts: 3979
Memory leaks
« Reply #22 on: July 18, 2005, 01:37:23 pm »
Quote from: rickg22
I thought there weren't memory leaks, but that it was the behavior of MS Windows...

Yes, sir!  That's what I've been saying.
"We should forget about small efficiencies, say about 97% of the time: Premature quotation is the root of public humiliation."

Luca

  • Guest
Memory leaks
« Reply #23 on: July 18, 2005, 05:02:42 pm »
Quote from: rickg22
I thought there weren't memory leaks, but that it was the behavior of MS Windows...


I haven't run C::B long enough to observe its memory usage increasing. But if this happens, then I strongly believe there ARE memory leaks, as I explained in a previous post...

Luca

Offline rickg22

  • Lives here!
  • ****
  • Posts: 2283
Memory leaks
« Reply #24 on: July 18, 2005, 05:35:56 pm »
Keep in mind that there WAS a memory leak in 1.0-betafinal, due to some popup menus being created and not destroyed...
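The pattern rickg22 describes is easy to illustrate. This is my own sketch, not the actual Code::Blocks/wxWidgets code: a heap object created per event and never deleted leaks once per event, while giving it an owner (a smart pointer, or stack allocation) fixes it:

```cpp
#include <cassert>
#include <memory>

static int menus_alive = 0;   // instrumentation for this example only

struct PopupMenu {            // stand-in for a GUI popup-menu object
    PopupMenu()  { ++menus_alive; }
    ~PopupMenu() { --menus_alive; }
};

void on_right_click_leaky() {
    PopupMenu* m = new PopupMenu;   // shown, then the pointer is dropped
    (void)m;                        // -> leaks one menu per right-click
}

void on_right_click_fixed() {
    auto m = std::make_unique<PopupMenu>();   // destroyed on scope exit
    (void)m;
}
```

With the leaky handler, every right-click adds another never-freed menu to the process's virtual memory, which is exactly the slow-growth signature discussed earlier in the thread.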

Offline kagerato

  • Multiple posting newcomer
  • *
  • Posts: 56
    • kagerato.net
Memory leaks
« Reply #25 on: July 18, 2005, 05:44:44 pm »
Quote from: rickg22
Keep in mind that there WAS a memory leak in 1.0-betafinal, due to some popup menus being created and not destroyed...


Over an extended usage period, that certainly could make a difference.  I think the problem here has been solved and everyone is running in circles...