Author Topic: Compile N files at once  (Read 22703 times)

Offline Ceniza

  • Developer
  • Lives here!
  • *****
  • Posts: 1441
    • CenizaSOFT
Compile N files at once
« on: August 10, 2005, 07:49:24 pm »
I'm currently compiling the latest CVS HEAD of Code::Blocks and I see it only uses 50% to 70% of the CPU when it could be using 100% (and so compile faster). With make you can pass -j# (e.g. -j3) to compile three files at once, but I couldn't find anywhere to add that option.

Have you ever considered this option?

Offline rickg22

  • Lives here!
  • ****
  • Posts: 2283
Re: Compile N files at once
« Reply #1 on: August 10, 2005, 08:16:10 pm »
I usually get 100% CPU usage from GCC when compiling...

Offline thomas

  • Administrator
  • Lives here!
  • *****
  • Posts: 3979
Re: Compile N files at once
« Reply #2 on: August 10, 2005, 08:17:34 pm »
Quote from: rickg22: "I usually get 100% CPU usage from GCC when compiling..."
Second that. Also, parallel compilation will have to deal with dependencies.
"We should forget about small efficiencies, say about 97% of the time: Premature quotation is the root of public humiliation."

Offline Ceniza

  • Developer
  • Lives here!
  • *****
  • Posts: 1441
    • CenizaSOFT
Re: Compile N files at once
« Reply #3 on: August 10, 2005, 09:05:33 pm »
The problem here is that GCC mainly uses only one of the (hyper-threaded) logical CPUs, so I want it to use 100% :)

Offline rickg22

  • Lives here!
  • ****
  • Posts: 2283
Re: Compile N files at once
« Reply #4 on: August 10, 2005, 09:43:08 pm »
You could generate a makefile, open two command prompts, and compile different targets on each.

takeshimiya

  • Guest
Re: Compile N files at once
« Reply #5 on: August 10, 2005, 09:59:03 pm »
A funny solution, doing the multithreading by hand :lol:


But that wouldn't work if your project has only one target and you have two or more CPUs/cores.   :)
« Last Edit: August 10, 2005, 10:05:47 pm by takeshimiya »

Offline thomas

  • Administrator
  • Lives here!
  • *****
  • Posts: 3979
Re: Compile N files at once
« Reply #6 on: August 11, 2005, 12:59:39 am »
Thinking about it again, what I said is actually bull...
Dependencies are an issue with cluster compilation, but here we have all the files on disk anyway. So the only thing that could be a problem is the linker, which can of course only start after all compile jobs have finished.

Could be an interesting feature for the future. The compiler in c::b runs asynchronously anyway, and the next job is taken from the queue when the event sent by PipedProcess::OnTerminate is processed. What if there were a semaphore whose value the user could configure (1, 2, 3, whatever), and as long as there are jobs remaining and DoRunQueue() can acquire the semaphore, another compile process is started? It is only an idea, but it might just work.

Only two problems:
1. The linker has to know when to start, i.e. there must be some kind of counter for the total number of source files.
2. Compiler output. How do you receive messages from two or three processes which may alternately send you stuff via stdout or stderr? One process may fail, the other may run fine. One could pass a number via wxCommandEvent::SetExtraLong, maybe. Then the complete output of each job could be buffered and appended to the log in one block.

Offline rickg22

  • Lives here!
  • ****
  • Posts: 2283
Re: Compile N files at once
« Reply #7 on: August 11, 2005, 02:29:25 am »
The only logical solution is to make GCC multithreaded :P

Offline Ceniza

  • Developer
  • Lives here!
  • *****
  • Posts: 1441
    • CenizaSOFT
Re: Compile N files at once
« Reply #8 on: August 11, 2005, 04:19:09 am »
So, there is no way to add -j3 (without editing the source) when using Makefiles? That would solve the problem for me but I would really like to see it implemented when calling the compiler directly.

takeshimiya

  • Guest
Re: Compile N files at once
« Reply #9 on: August 11, 2005, 06:42:31 am »
The "-j" option is make's "Parallel Execution" of jobs: http://www.gnu.org/software/make/manual/html_mono/make.html#SEC55
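It's easy to try -j by hand with a throwaway makefile (two independent "compile" targets, and an "all" target playing the linker):

```shell
# Write a tiny makefile: targets a and b are independent, 'all' depends on both.
printf 'all: a b\n\t@echo link\na:\n\t@echo compile a\nb:\n\t@echo compile b\n' > /tmp/jdemo.mk
# With -j2, make may run a and b in parallel; 'link' always comes last.
make -f /tmp/jdemo.mk -j2
```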
But direct compilation from Code::Blocks (of course) doesn't use make at all, so C::B would have to support parallel compilation itself.
Is the compilation in C::B multithreaded right now?

Offline thomas

  • Administrator
  • Lives here!
  • *****
  • Posts: 3979
Re: Compile N files at once
« Reply #10 on: August 11, 2005, 08:43:18 am »
Quote from: takeshimiya: "Is the compilation in C::B multithreaded right now?"
Code::Blocks calls the compiler asynchronously through a PipedProcess object (derived from wxProcess). The compiler runs as its own process, and the application does not block.
PipedProcess reads stdout and stderr from the compiler regularly (on a 100 ms timer) and is notified of compiler termination by wx via OnTerminate(). All compiler output and the termination notification are passed to the application via a number of wxCommandEvents (or cbEvents? I don't remember, but it does not matter much anyway).

The fact that compilation *still* runs "synchronously" is artificial: the next job is started only after the event announcing that the last job finished has been processed. So basically it would be a snap to run any number of compiles in parallel; all you really have to think through is how to synchronize with the linker and how to prevent an awful mess on the output screen.

grv575

  • Guest
Re: Compile N files at once
« Reply #11 on: August 11, 2005, 07:57:45 pm »
Have a main compile thread which blocks until all the OnTerminate() calls complete, then have the main thread call the linker (no need to parallelize the link step). Output is still debatable -- should only the error messages from the first job to produce errors be shown? Kill all the other compile jobs when that happens? The output polling would then need to tag each message with a job id?

Offline thomas

  • Administrator
  • Lives here!
  • *****
  • Posts: 3979
Re: Compile N files at once
« Reply #12 on: August 11, 2005, 08:22:24 pm »
One efficient solution for output would be to skip sending the stdout and stderr events. Instead, concatenate everything into one string and send that through OnTerminate(). If the exit code indicates an error, empty the job queue. The remaining processes will terminate normally (killing them is not a good idea), and the queue will run out.

Having a main thread is not a good idea in my opinion, as everything should be handled the same way; special cases are always bad. One could, for example, set static counter = queue.GetCount() when compilation starts and decrement the counter in OnTerminate(). When the counter reaches zero, post a "start linking now" message to the message queue. This way all jobs are equal: no special cases, no exceptions.

Offline rickg22

  • Lives here!
  • ****
  • Posts: 2283
Re: Compile N files at once
« Reply #13 on: August 11, 2005, 08:32:31 pm »
Why not just use makefiles and make a CB patch to specify additional parameters for make?

grv575

  • Guest
Re: Compile N files at once
« Reply #14 on: August 11, 2005, 08:33:33 pm »
make's -j parameter doesn't even work on DOS/Windows; it only spawns additional jobs on Unix.