Developer forums (C::B DEVELOPMENT STRICTLY!) > Plugins development
Query on the Philosophy of the Build System
Stevo:
Hi,
Before I start, let me say this:
1. I think you guys have done a great job so far with C::B
2. I really like C::B so far (bugs aside) :)
3. This post isn't a flame or a troll. I'm genuinely confused about the intent of the build system, but this isn't meant as an attack on anyone or an insult to their effort on this project, so please don't take it that way.
4. I'm happily bypassing the integrated build system, so this post isn't a gripe about that.
C::B has (from my perspective) a really awkward build system. It seems it can do simple things really simply, but can't do complex things easily at all, nor (as far as I can tell) will it ever be able to manage complex projects easily. So I wonder to myself: what's the point? Obviously a lot of effort is being put into it, and I'm sure the developers have their reasons; I just can't see them, and it seems to me to make certain parts of the C::B project overly complicated.
As anyone who has read my (admittedly few) previous posts knows, I use build systems like Jam and SCons, and on occasion Autotools/Make.
Each of these has its strengths and weaknesses, and I don't propose to debate their relative merits, or to try to convince anyone to use or not use any particular tool (C::B included).
The projects I develop have multiple sources of data: they automatically create code from things like FPGA bit streams, they encapsulate graphics files as C files and then link them with code, they incorporate assembler code, they link to non-standard libraries, they run various utilities to package the end results for my targets (which aren't PCs), etc. Once I get an ELF from my build, I'm only halfway to having a usable program. Setting this up under C::B's build system might be possible, but it would be tedious.
Further, because there is no "build" script as such, I have nothing to save with the project so I can rebuild it in 6 months, 12 months, or 2 years' time. I would need to save a copy of the C::B I used to build it to guarantee I could rebuild it later from the saved C::B project files (in case C::B makes an incompatible change, which I can't guarantee it won't). And if I were to redistribute my source (say to a customer, or a government regulator), I would have to give them a copy of C::B to build the application, which would be a more difficult prospect than having them install SCons, Jam, or Make and just run the build script through it.
Also, I don't see how the current build system handles things like (but not limited to):
1. Building mixed-language applications.
2. Handling custom dependency scanning.
3. Handling a build such that a file will always be recompiled if its code changes (dates are not enough: what if the file hasn't changed, but the compiler flags for it have? What if you are building from a network share that has date-synchronisation problems?)
4. Autogenerating C code from custom data files (in a custom way), and then compiling the result.
5. Only doing 4 if the data files change (relates to 2).
6. Post-processing the ELF (or whatever) output file from the link (or similar) stage.
7. Handling Autoconf-like support for finding #include files, libraries, functions, and typedefs depending on the target being built on.
8. Rebuilding if the map file (generated during the link) is missing (which means re-linking, even if the output .elf still exists).
9. Easily re-using common custom elements across projects.
What all of this boils down to for me is:
1. I don't see how the integrated build system can be used for open-source software, because it would force any prospective contributors to use C::B to build it, and they may want to use vi or Emacs, or something else, so it would limit user contribution. The alternative is to maintain a makefile and independently maintain the C::B project. I know C::B can autogenerate makefiles (at least in theory; I've never done it), but that's still suboptimal, because if a contributor adds a file, the maintainer has to back-port their changes to the makefile into the C::B project and regenerate. There is also room for error, because the makefile may not produce exactly the same build as the C::B build system does.
2. I don't see how it can be used (at least easily) for programs that need a repeatable build system, or that have complex build requirements.
3. The only application I see is for simple (maybe university-style) projects that have limited life and applicability. It may be very easy for those types of projects, but it has little applicability (as far as I can see) to many real-world programming problems.
I can't see the point, but surely there must be one, because a lot of unpaid (but not unappreciated) time and effort is devoted to it. So I guess the questions are:
What's the philosophy behind it?
What am I missing?
What problem does the integrated build system seek to solve?
Thanks for reading this far.
Stevo
thomas:
Is today April the first? Sorry, but I really can't believe you are serious. :lol:
In particular, the argument about being unfit for open-source development because someone might want to use vi strikes me as odd...
The only way to solve this issue is to not use an IDE at all and write all your makefiles by hand. No matter what IDE you use, and no matter how sophisticated it is, you will always find someone unwilling to use it, and you will always have the problem of having to back-port something that someone has written. The only way around this is for everybody to use the least common denominator (i.e. everybody uses vi and make).
It is your decision, however, whether you prefer to live in 2006 or in 1978. Personally, I prefer to click the blue gear, and whenever I have to manually edit a config file anywhere, there's this voice in my head saying "duh... couldn't they do it the easy way".
Also, I am surprised that you think installing Jam is easier than installing Code::Blocks. Well, maybe it is a matter of taste. Personally, I perceive Jam (despite it being a good build system otherwise) as an extreme bitch to build and set up.
On the other hand, your fear about future incompatibilities of project files is quite understandable. Considering that my Fedora Core 4 DVD contains four different versions of automake which are not compatible with each other, this fear seems justified... :)
As far as autogenerating data is concerned, I have to wonder what makes you believe that Code::Blocks is unfit for that purpose. I use it for that purpose every day.
As it happens, Code::Blocks even uses an auto-generated header in its own build process (running the autorelease tool is a very modest example, but it is nevertheless a proof of concept).
I am working on a proprietary project about twice the code size of Code::Blocks, using Code::Blocks and using a lot of auto-generated or auto-processed data. Sure enough, you have to do some scripting here and there to make everything work seamlessly, but it is really not a problem.
I am quite amazed if your projects are so big that you find Code::Blocks unsuitable to handle them... are your projects in the dimension of OpenOffice?
But now for the one important question:
If you think that the Code::Blocks build system is unfit for some purpose or lacks a vital functionality, can you name a different IDE which has this functionality (so we can have a look at it) or do you have a proposal of what is missing and how it could be implemented?
It is not like we aren't willing to go for it if you can name what exactly is missing ;)
280Z28:
@stevo
I think the best reply is you use yours, and I'll continue using mine and making it better for what I'm using it for. :umm: :umm:
Game_Ender:
I think you have really missed a point here. You can still get almost all of the IDE functionality of Code::Blocks and still use another build system. You can just "enable custom makefiles" and then tell it what make commands to run for a target. You can then have your elaborate collection of make/jam files (which you must have) do the work. As long as you tell Code::Blocks where the final binary went, you can still debug it too.
Now, if you were to use Code::Blocks' build system, you have several options:
* You can run arbitrary pre- and post-build shell commands, which could do... anything.
* You can set a target to be dependent on external files.
* Soon you will be able to use build hook scripts written in AngelScript to modify compile settings based on platform.
So with that said, are we down to just your fear of your build system going obsolete?
takeshimiya:
--- Quote ---1. Building mixed language applications.
--- End quote ---
From what I understand, this depends on the compiler. I've read in the forums that someone is mixing Fortran and C in Code::Blocks without problems.
--- Quote ---2. Handling custom dependency scanning.
--- End quote ---
What is a "custom dependency scanning"? Inter-project dependencies?
--- Quote ---3. Handling a build such that a file will always be recompiled if its code changes (dates are not enough: what if the file hasn't changed, but the compiler flags for it have? What if you are building from a network share that has date-synchronisation problems?)
--- End quote ---
You mean, using signatures instead of dates. I don't know which one C::B uses.
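The signature idea can be sketched concretely. This is a hypothetical illustration of the technique, not C::B's actual mechanism: hash each source file's contents together with the compiler flags, and recompile whenever the signature differs from the one recorded at the last build. That makes the check immune to timestamp problems (network shares, touched-but-unchanged files), and a flag change alone is enough to trigger a recompile.

```python
import hashlib
import json
import os

def signature(source_path, compiler_flags):
    """Signature of a translation unit: content hash plus the flags used."""
    h = hashlib.sha256()
    with open(source_path, "rb") as f:
        h.update(f.read())
    h.update(" ".join(compiler_flags).encode())
    return h.hexdigest()

def needs_rebuild(source_path, compiler_flags, cache_file=".sigcache.json"):
    """Return True if the source's signature differs from the one recorded
    at the last build, and record the new signature for next time."""
    cache = {}
    if os.path.exists(cache_file):
        with open(cache_file) as f:
            cache = json.load(f)
    sig = signature(source_path, compiler_flags)
    if cache.get(source_path) == sig:
        return False
    cache[source_path] = sig
    with open(cache_file, "w") as f:
        json.dump(cache, f)
    return True
```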
--- Quote ---4. Autogenerating C code from custom data files (in a custom way), and then compiling the result.
--- End quote ---
This can be done as thomas said, with a target, and having it as the first target.
Or you can use AngelScript in this case (although I prefer the first solution).
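For illustration, the generator such a target might invoke can be very small. Here is a hypothetical bin2c-style converter (the names are made up) that turns a data file, e.g. a graphics file, into a compilable C array:

```python
def bin_to_c(data: bytes, symbol: str) -> str:
    """Render a binary blob as C source: an unsigned char array plus its
    length, ready to be compiled and linked alongside the rest of the code."""
    body = ",".join(str(b) for b in data)
    return (
        f"const unsigned char {symbol}[] = {{{body}}};\n"
        f"const unsigned int {symbol}_len = {len(data)}u;\n"
    )

# A pre-build target could call something like:
#   with open("logo.png", "rb") as src, open("logo.c", "w") as out:
#       out.write(bin_to_c(src.read(), "logo_png"))
```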
--- Quote ---5. Only doing 4, if the data files change (relates to 2).
--- End quote ---
Same as 4. You can do just the same.
--- Quote ---6. Post-processing the ELF (or whatever) output file from the link (or similar) stage.
--- End quote ---
You can do that using pre/post build steps.
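As a concrete, hypothetical example of what such a post-build step might run (the names and the footer format here are invented, not anything C::B ships): appending a length/CRC32 footer to the raw binary is a typical packaging step for non-PC targets, so a bootloader can verify the image.

```python
import struct
import zlib

def package_firmware(raw_image: bytes) -> bytes:
    """Typical post-link packaging step: append a little-endian length
    and CRC32 footer so the target's bootloader can verify the image."""
    footer = struct.pack("<II", len(raw_image), zlib.crc32(raw_image) & 0xFFFFFFFF)
    return raw_image + footer
```

A post-build command would read the linked binary, run it through a script like this, and write out the final image.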
--- Quote ---7. Handling Autoconf-like support for finding #include files, libraries, functions and typedefs depending on the target being built on.
--- End quote ---
OK, this is the most important one, and your entire post can be reduced to this.
I expect that somewhere in the future, C::B could export automake/autoconf projects.
But this is really a picky subject. Because of two facts:
1) Autotools are found on every POSIX system. They work almost every time, and are very easy and standard from a USER's point of view.
2) Autotools are a pain in the ass to maintain from a DEVELOPER point of view. And they suck bad. :lol:
So it's a debate of doing things "the standard way (on *nix)" vs. "another (better) way".
Not an easy topic, but this needs to be discussed more.
--- Quote ---8. Rebuilding if the map file (generated during the link) is missing (which means re-linking, even if the output .elf still exists).
--- End quote ---
Doesn't it do this already?
--- Quote ---9. Easily re-using common custom elements across projects.
--- End quote ---
This is also the other very important topic. I'm working on this at an inter-platform level. Not an easy topic either.