Hello Speedreadersteve.
Initially you wrote:
... since I compiled a 500+ line file - mostly of comments and ~200 lines code, a lot of statements in main() - it's taken a consistent 6 seconds. ...
Please keep in mind that the compiler processes not only the lines you wrote yourself but also every line in the header files you include. That is why even small changes in your own code can lead to the issue you describe.
Before the compiler starts translating the file, the preprocessor expands your code:
- Every #include line is replaced by the complete content of the included file. If an included file contains #include directives itself, the preprocessor replaces those with the corresponding file content as well, recursively.
- If you use preprocessor defines (perhaps provided by a library you use), every occurrence of those CONSTANTS and MACROS in your code is replaced by the text given in their #define statement.
- This is only a rough, general description, but I hope it makes clear that the mere use of the #include and #define directives can increase the compiler's workload even if your own code contains fewer than 10 lines.
Especially today, many modern libraries, for example those provided by the Boost project, are based on meta-programming. You may see a similar effect simply by using the STL (Standard Template Library) that ships with any standard C++ build suite.
- Such a library may consist of header files only, which define nothing but templates. Before they are actively used, these templates cannot be compiled, because their data types are only placeholders.
- By using them, you define a concrete data type for each placeholder in your own code, and based on that definition the compiler generates the final code of the library object you want to use.
- This increases the compiler's workload, comparable to the preprocessor workload described above.
Furthermore, if you use a meta-programming-based library or the STL in a header, every cpp file that includes this header, directly or indirectly, is hit by a combination of both effects.
I hope this gives you an idea of how adding just 1 or 2 lines of code can blow up the build effort.
You may reduce the resulting problems by precompiling your header files, as the other discussion participants suggested. So please follow their advice.
However, you can also attack the root cause itself by refactoring your software design.
- Limit the dependencies between your sources as far as possible. Whenever you can eliminate the use of another module or class, you can also eliminate the corresponding header include.
- Split big header files whenever possible. If a header is included by 10 other files, but each including file uses only 10% of its content, try to split it into 10 smaller headers, so that each source includes just the one small header with the information it needs.
- Where possible, include headers in cpp files rather than in other headers, to reduce indirect includes.
Do you already use Doxygen together with the Dot tool from Graphviz? For every one of your source files it can generate an include graph that shows not only the directly but also the indirectly included files. These graphs will help you to detect the most complex include dependencies.
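If you want to try it, these are the relevant switches in the Doxyfile (the rest of the configuration file is omitted here):

```
# Doxyfile excerpt: enable Graphviz-based include graphs
HAVE_DOT          = YES
INCLUDE_GRAPH     = YES
INCLUDED_BY_GRAPH = YES
```

INCLUDE_GRAPH shows what each file pulls in; INCLUDED_BY_GRAPH shows the reverse direction, which is handy for spotting headers whose changes trigger the most recompilations.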
Best regards,
Eckard Klotz.
PS: Dear forum admin, I know this is not actually a programming discussion. But I face similar issues very often in my own projects and had to learn over time that the reason is usually not the tool (Code::Blocks, build suite, ...) but in most cases the software design. So I hope sharing my experience helps to reduce the users' frustration a little.