No offense, but this is plain bullshit. I just edited a 37.5 MB file with 494,000 lines in Code::Blocks to verify this claim: the same block of lines copied and pasted many times, then saved to disk.
On my machine, opening such a ridiculously big file takes about 8-9 seconds, and there is a lag of about half a second while editing. Cutting 50 lines from the middle of the document and pasting in another place (arbitrary position) takes no longer than half a second, either.
Deleting the first 1,000 lines from this document takes about a second. Deleting the next 10,000 lines (after the first 1,000) takes about 12 seconds. Saving that document takes about 3 seconds.
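The roughly linear growth in deletion time (about 1 s for 1,000 lines vs. about 12 s for 10,000) is what you would expect if the editor does per-line work on a contiguous buffer. This is not Code::Blocks' actual implementation — just a toy illustration with a plain Python list standing in for the line buffer, and a smaller document than the 494,000-line file above to keep it quick:

```python
import time

def delete_one_at_a_time(lines, start, count):
    # Deleting line-by-line: each del shifts the entire tail of the
    # list, so the total work is roughly count * len(lines).
    for _ in range(count):
        del lines[start]

doc = [f"line {i}" for i in range(200_000)]

t0 = time.perf_counter()
delete_one_at_a_time(doc, 0, 1_000)
t1 = time.perf_counter()
delete_one_at_a_time(doc, 0, 10_000)
t2 = time.perf_counter()

print(f"1,000 lines: {t1 - t0:.3f}s; 10,000 lines: {t2 - t1:.3f}s")
```

A real editor additionally updates undo history, markers, and folding per deleted line, which adds to the per-line cost but does not change the linear scaling.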
Editing a 2.2 MB file (which is still outright ridiculous) has no noticeable delay at all (cutting and pasting a block the size of about 50% of the text takes under a second).
The figures for SciTE are similar, except that SciTE has no noticeable lag at the beginning of arbitrarily large documents (to be honest, I only verified "arbitrarily large" with an 80 MB document). Instead, editing gets delayed only at the end of the document. However, for any reasonable document size (under 5 MB), there is no noticeable delay at all.
How can you call this "bad management"? Honestly, I could not possibly write an editor that performs any better than this (or even comes close). Can you?
If you seriously consider editing source files whose sizes are on the order of several megabytes, you have a problem. It is not your editor, though...
EDIT: I forgot to mention that the above numbers include the overhead for the code completion parser, which consumes massive CPU and memory, too. And yet it still works acceptably at file sizes that are really unrealistic (who has source files larger than 2 MB, anyway?).