Developer forums (C::B DEVELOPMENT STRICTLY!) > Plugins development

Query on the Philosophy of the Build System


Game_Ender:

--- Quote from: Takeshi Miya on January 24, 2006, 05:48:01 pm ---
--- Quote ---9. Easily re-using common custom elements across projects.
--- End quote ---
This is also another very important topic. I'm working on this at an inter-platform level. Not an easy topic either.

--- End quote ---

That is what the new build script hooks are for.  You can just write one script and use it in multiple projects.

takeshimiya:

--- Quote from: Game_Ender on January 24, 2006, 05:51:46 pm ---
--- Quote from: Takeshi Miya on January 24, 2006, 05:48:01 pm ---
--- Quote ---9. Easily re-using common custom elements across projects.
--- End quote ---
This is also another very important topic. I'm working on this at an inter-platform level. Not an easy topic either.

--- End quote ---

That is what the new build script hooks are for.  You can just write one script and use it in multiple projects.

--- End quote ---

Yes, but if you do it that way (AngelScript hooks for the build system), it will be more difficult to export them to autotools/makefiles/whatever.
The real problem is autoconf.

Stevo:
Hi again,


--- Quote from: thomas on January 24, 2006, 03:24:23 pm ---It is your decision, however, whether you prefer to live in 2006 or in 1978. Personally, I prefer to click the blue gear, and whenever I have to manually edit a config file anywhere, there's this voice in my head saying "duh... couldn't they do it the easy way".

--- End quote ---
Obviously I don't use Vi, or I wouldn't be posting here :)  But if you develop an open source project, your contributors, if you are lucky enough to get them, will come with all sorts of baggage (maybe that's my problem).  I was just commenting that it seems easier to me to get them to install another command line tool than to require them to use a particular all-encompassing GUI environment.  It might be a barrier to entry if your web site says "To compile and develop with this project you need to download and install C::B", and then they come here, see it's a big GUI thing, and that puts them off because they are an Emacs or Vi user (and there are a lot of them around).  Whereas an Emacs or Vi user isn't likely to be put off by another command line tool, because it doesn't interfere with their own development mindset.  That's what I was trying to get at.


--- Quote from: thomas on January 24, 2006, 03:24:23 pm ---Also, I am surprised what makes you think that installing Jam is easier than installing Code::Blocks. Well, it is a matter of taste, maybe. Personally, I perceive Jam (despite being a good build system otherwise) as an extreme bitch to build and set up.

--- End quote ---
Agreed, it can be a bitch to set up your build files correctly, but I find that is mostly due to a lack of good documentation. However, for an end user who just wants to build your project from source, installing it is a snap: it is a small exe that they can put anywhere, and it will just run.  It doesn't require any GUI toolkit to be installed, or library matching, or anything.


--- Quote from: thomas on January 24, 2006, 03:24:23 pm ---On the other hand, your fear about future incompatibilities of project files is quite understandable. Considering that my Fedora Core 4 DVD contains four different versions of automake which are not compatible with each other, this fear seems justified... :)

--- End quote ---
Yes, in my environment I need a 100% replicable build environment. What that comes down to is that, with a project, I archive the code of the compiler and all the libraries and tools I'm using to generate the code, so that in 3 years or more I can come back to it if I need to and get the same binary from the source.  It also means I can give my development environment to a government regulator, tell them how to install it and type ./build in the root of the project, and the output will be exactly the same binary/binaries as I've submitted for approval.


--- Quote from: thomas on January 24, 2006, 03:24:23 pm ---As far as autogenerating data is concerned, I have to wonder what makes you believe that Code::Blocks is unfit for that purpose. I use it for that purpose every day.
[snip]
But now for the one important question:
If you think that the Code::Blocks build system is unfit for some purpose or lacks a vital functionality, can you name a different IDE which has this functionality (so we can have a look at it) or do you have a proposal of what is missing and how it could be implemented?
It is not like we aren't willing to go for it if you can name what exactly is missing ;)

--- End quote ---

I'd say C::B is about as good as "custom built into the IDE" build systems get. They all have their strengths and weaknesses, and this post wasn't meant to be a comparative review of the merits or otherwise of C::B versus its competitors.  It's just that they all seem to be re-inventing the wheel, without a clear purpose for improving the nature of software engineering.  I've never been an advocate of feature creep of the "some app I've come across does this cool thing, so we should too" kind.  A build system is a very complex thing to get right, and my post was mostly about the philosophy: what problems are being solved, what's the theory behind solving those problems, and how is this system envisioned to be better for software engineering than what has come before.


--- Quote from: Takeshi Miya on January 24, 2006, 05:48:01 pm ---
--- Quote ---1. Building mixed language applications.
--- End quote ---
From what I understand, this depends on the compiler. I've read in the forums that someone is using Fortran and C mixed, in Code::Blocks, without problems.

--- End quote ---

I was thinking more of two different compilers that generate compatible object code, which can be linked by an appropriate linker.  Not that I've tried, but it seems not very straightforward to do, because the compiler setup is project wide.  This is, I would agree, an uncommon thing to do, but it is done.  I would expect the person mixing Fortran and C is using GCC, which is really one compiler with two (or more) language front ends.


--- Quote from: Takeshi Miya on January 24, 2006, 05:48:01 pm ---
--- Quote ---2. Handling custom dependency scanning.
--- End quote ---
What is a "custom dependency scanning"? Inter-project dependencies?

--- End quote ---

Ok, consider this fictitious example as a description of what I'm getting at:

I have a utility called "MyFileManipulator". It takes a file, which lists a bunch of operations to do with other files, and generates an output file.
You would call it thus:
MyFileManipulator output input_script

An example input_script might look like this:


--- Code: ---Add ../Data/MyMenuImage.png
DefinesFrom MyGlobalDefines.h
Add ../Data/MyMenuFont.png
Shrink ../Resources/ABigLogo.jpg 80,48
EncodeIntoLsBit ../Resources/ATestPattern.png "This is a test pattern"
Concatenate FurtherManipulatedFiles.mlist

--- End code ---

The build system for this then goes:
I build ManipulatedData.dat from FilesToManipulate.mlist by calling MyFileManipulator.
I can see that ManipulatedData.dat is up to date with FilesToManipulate.mlist, BUT
this isn't enough, because ManipulatedData.dat is also dependent on the files listed in FilesToManipulate.mlist. However,
I don't know how to process FilesToManipulate.mlist to extract the dependencies from it (like I do automatically for C files, by scanning for #include). The user has, however, provided me with a script (or regex expression, or something) I can use to extract those dependencies (for this kind of file).  So, using the above example, the script gives the build system the following list of dependent files:

../Data/MyMenuImage.png
MyGlobalDefines.h
../Data/MyMenuFont.png
../Resources/ABigLogo.jpg
../Resources/ATestPattern.png
FurtherManipulatedFiles.mlist

The build system can then check whether any of these, or files they depend on, have changed.  MyGlobalDefines.h (being a .h file) would be processed by the standard .h file dependency checker; FurtherManipulatedFiles.mlist would be checked by the custom mlist dependency scanner; and so on, until there are no more dependencies to check.  If any of them have changed, the ManipulatedData.dat file is regenerated.
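To make the idea concrete, here is a minimal sketch (purely illustrative; the .mlist format and command names are the hypothetical ones from the example above) of what a user-supplied custom dependency scanner could look like:

```python
# Commands in the hypothetical .mlist format whose first argument is a file
# the output depends on. Extra arguments (e.g. Shrink's "80,48") are ignored.
FILE_COMMANDS = {"Add", "DefinesFrom", "Shrink", "EncodeIntoLsBit", "Concatenate"}

def scan_mlist_dependencies(text):
    """Return the list of files an .mlist input script depends on."""
    deps = []
    for line in text.splitlines():
        parts = line.split()
        if parts and parts[0] in FILE_COMMANDS:
            deps.append(parts[1])  # first argument is always a file path
    return deps
```

The build system would register this scanner against the .mlist extension, just as it registers its built-in #include scanner against .c and .h files, and then recurse over the returned list.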

Without this, the developer needs to keep in his head: "ah, I've changed a file that my auto-generated data relies upon, but the build system can't recognise it, so I either need to do a clean build or delete ManipulatedData.dat".  The problem is that this can be forgotten: you build, it's a clean build, but it's not an up-to-date build.

Also, you don't want to have to go and tick a box in a dialog that says "rebuild this file if this newly selected file changes", because that relies on a process of "OK, I've edited the .mlist file, now I've got to go and make sure all of the dependencies in the build system are spelled out".  That process is error prone.  A build system should handle the details once you have specified the rules.  Sometimes it's not possible to autoscan a dependency, so you absolutely must force it, but those instances should be minimised, and everything that possibly can be automated should be (able to be) automated.


--- Quote from: Takeshi Miya on January 24, 2006, 05:48:01 pm ---
--- Quote ---3. Handling a build, such that a file will always be recompiled if its code changes (dates are not enough: what if the file hasn't changed, but the compiler flags for it have?  What if you are building from a network share, and it has date synchronisation problems?)
--- End quote ---
You mean using signatures instead of dates. I don't know which one C::B uses.
--- End quote ---
I don't see any evidence of C::B using signatures, because usually they are cached somewhere.  Yes, I think signatures are a better approach, but it is a matter of philosophy; there may be an even better approach (I don't know).  But if you use signatures, what do they contain?  How do you cache them so you are not unnecessarily generating signatures for unmodified code?  How do you handle signatures for the scripts that decide what to build and how?
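One possible answer to those questions, sketched in Python (this is an illustration of the idea, not how C::B or any particular build system actually works): make the signature a hash over both the file contents and the command line used to build it, and keep the signatures from the previous build in a cache.

```python
import hashlib

def signature(source_text, compile_flags):
    """Hash both the file contents and the flags used to build it,
    so that a flag change forces a rebuild even if the file is untouched."""
    h = hashlib.sha256()
    h.update(source_text.encode())
    h.update(compile_flags.encode())
    return h.hexdigest()

def needs_rebuild(source_text, compile_flags, cache, path):
    """Compare against the signature cached from the previous build,
    then record the new signature for next time."""
    sig = signature(source_text, compile_flags)
    stale = cache.get(path) != sig
    cache[path] = sig
    return stale
```

This sidesteps network-share clock skew entirely, since no timestamps are involved; the remaining question from the post (signing the build scripts themselves) could be handled by feeding the script text through the same hash.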


--- Quote from: Takeshi Miya on January 24, 2006, 05:48:01 pm ---
--- Quote ---4. Autogenerating c code from custom data files (in a custom way), and then compiling the result.
--- End quote ---
This can be done, as thomas said, with a target, and having it as the first target.
Or you can use AngelScript in this case (although I prefer the first solution).

--- End quote ---
But the problem, as I understand it, with both of these approaches currently is that the custom data would be re-generated every build.  Some of my custom data takes minutes to generate (on a very fast computer), and I don't want to do it if I don't need to.  I suppose I could code a script to work out explicitly whether the file needs to change, but then the build engine isn't doing anything for me, because it isn't aware of this extra work.
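What is being asked for could be sketched as a rule whose recipe only runs when the combined signature of its inputs has changed (illustrative only; `generate` stands in for the slow custom generator, and the in-memory dicts stand in for files the build system would persist):

```python
import hashlib

def regenerate_if_needed(inputs, cache, generate):
    """Run the (expensive) generator only when the combined signature of
    all its input files has changed since the last build.

    inputs   -- {path: contents} of every file the output depends on
    cache    -- signature remembered from the previous build (one-item dict)
    generate -- the slow custom generation step, called only when stale
    """
    h = hashlib.sha256()
    for path in sorted(inputs):      # stable order so the hash is reproducible
        h.update(path.encode())
        h.update(inputs[path].encode())
    sig = h.hexdigest()
    if cache.get("sig") == sig:
        return False                 # up to date: skip minutes of work
    generate()
    cache["sig"] = sig
    return True
```

The point of the post is that this bookkeeping belongs in the build engine, not in an ad-hoc user script: once the engine owns it, the generated file participates in normal dependency checking like any compiled object.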


--- Quote from: Takeshi Miya on January 24, 2006, 05:48:01 pm ---
--- Quote ---6. Post-processing a built .elf (or whatever) output file from the link (or similar) stage.
--- End quote ---
You can do that using pre/post build steps.
--- End quote ---
True, but again, they would be executed every time.  Also, they may be derived from a number of .elf files.  Again, I see these as targets that get built after the .elf file(s); in my case they are more important than the .elf files (even though they may be built from them).  Again, they shouldn't rebuild if their dependencies haven't changed.


--- Quote from: Takeshi Miya on January 24, 2006, 05:48:01 pm ---
--- Quote ---7. Handling Autoconf-like support for finding #include files, libraries, functions and typedefs depending on the target being built on.
--- End quote ---
Ok, this is the most important one; your entire post can be summed up in this.
I expect that somewhere in the future, C::B could export automake/autoconf projects.

But this is a really tricky subject, because of two facts:
1) Autotools are found on every POSIX system. They almost always work, in a very easy and standard way, from a USER point of view.
2) Autotools are a pain in the ass to maintain from a DEVELOPER point of view. And they suck badly. :lol:

--- End quote ---
Agreed and agreed.  The single biggest argument people put up for using the autoconf tools over any other method is "there are a lot of M4 scripts already written for autoconf".  To my mind, that's not as important as having a build system that can deal with the sorts of problems autoconf is designed to address.  If it can, then a developer is free to add their own tests, and a new standard test library can be built to make a developer's life easy.  I'm also not convinced most of the auto tests are worthwhile, but that is a separate subject altogether.


--- Quote from: Takeshi Miya on January 24, 2006, 05:48:01 pm ---So it's a debate of doing things "the standard way (on *nix)" vs. "another (better) way".
Not an easy topic, but this needs to be discussed more.
--- End quote ---
And this is the crux of my post (even if I failed to express it well).  If C::B is going to re-invent the build system, it needs to make sure it's a "better way", or there isn't a lot of point to it.


--- Quote from: Takeshi Miya on January 24, 2006, 05:48:01 pm ---
--- Quote ---8. Rebuilding if the map file (generated during the link) is missing. (Which means re-linking, even if the output .elf still exists)
--- End quote ---
It doesn't do this just now?

--- End quote ---
This is a "2 targets from 1 set of sources" problem; I don't see how to set that up in C::B.

In generic form:

XC output.1 output.2 list_of_inputs

XC generates both output.1 and output.2 from the list of inputs.  XC needs to be called to regenerate both output.1 and output.2 if either is out of date with the sources.  Most build systems assume one output from a list of inputs, which in the case of a link isn't true: you get a link map (if you pass the right options) and the linked executable, and for me both are equally important to the development process.
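One way to model this (an illustrative sketch, not any particular build system's behaviour) is to treat both files as products of a single rule: if either output is missing or older than any input, the one command re-runs and refreshes every output at once. The decision logic, using mtimes for simplicity (the signature argument earlier in the thread applies equally):

```python
def rule_is_stale(output_mtimes, input_mtimes):
    """Decide whether a multi-output rule must re-run.

    output_mtimes -- mtime of each declared output; None if the file is missing
    input_mtimes  -- mtime of each input

    The rule is stale if ANY output is missing or if the OLDEST output
    predates the NEWEST input; re-running the single command then
    refreshes all outputs together.
    """
    if any(m is None for m in output_mtimes):
        return True
    return min(output_mtimes) < max(input_mtimes)
```

This covers the map-file case from the post: deleting the .map makes one output mtime `None`, so the link re-runs even though the .elf still exists.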


--- Quote from: Takeshi Miya on January 24, 2006, 05:48:01 pm ---
--- Quote ---9. Easily re-using common custom elements across projects.
--- End quote ---
This is also another very important topic. I'm working on this at an inter-platform level. Not an easy topic either.

--- End quote ---
No, not an easy subject with a GUI interface.  And to be clear, I'm not talking about two variants of the same tree; I'm talking about taking things like custom dependency scanners, auto code generation scripts, custom post-processing sequences, and autoconf-like stuff from one project and sticking it in another (completely unrelated) project, where the only similarity is that the sorts of build processing remain similar.

Game_Ender:
You have spent a lot of time writing, but a little more time looking would have shown you that anything in the build system options can be set on an IDE, project, or target basis.  This includes the compiler.  So you could have your project compile under 5 different compilers and then a 6th custom one to link it all together.  Just go to Build options, click your target and change "Selected compiler".

About the tool chain issue: I think it would be a good idea to look into a way to make a standalone Code::Blocks build system.  There would be some difficulty in making it completely standalone because, for better or worse, the entire project (including the debugger plugin) uses wxStrings, wxChars, and the wxWidgets event system.  With that said, you could probably make a build system that had no GUI but still needed wxWidgets around a little more easily.

Urxae:

--- Quote from: Stevo on January 25, 2006, 01:35:28 am ---I was just commenting that it seems easier to me to get them to install another command line tool than to require them to use a particular all-encompassing GUI environment.

--- End quote ---

C::B can be used from the command line. I do so every day. (Actually, I use a shortcut, but that's just for convenience.)


--- Quote ---It might be a barrier to entry if your web site says "To compile and develop with this project you need to download and install C::B", and then they come here, see it's a big GUI thing, and that puts them off because they are an Emacs or Vi user (and there are a lot of them around).  Whereas an Emacs or Vi user isn't likely to be put off by another command line tool, because it doesn't interfere with their own development mindset.  That's what I was trying to get at.

--- End quote ---

If you don't want to tell people where to get C::B, feel free to bundle it.
And there's no need to use the GUI if you don't want to. You'll need to use it at least once to tell it where the compiler is (or at least to confirm the auto-detect), and optionally to disable plugins not needed to compile, but after that you don't ever need to open the GUI to build a project.


--- Quote ---
--- Quote from: thomas on January 24, 2006, 03:24:23 pm ---Also, I am surprised what makes you think that installing Jam is easier than installing Code::Blocks. Well, it is a matter of taste, maybe. Personally, I perceive Jam (despite being a good build system otherwise) as an extreme bitch to build and set up.

--- End quote ---
Agreed, it can be a bitch to set up your build files correctly, but I find that is mostly due to a lack of good documentation. However, for an end user who just wants to build your project from source, installing it is a snap: it is a small exe that they can put anywhere, and it will just run.  It doesn't require any GUI toolkit to be installed, or library matching, or anything.

--- End quote ---

Okay, so C::B isn't contained in one file, and still needs wxWidgets to be present even if the GUI isn't used. I'll give you that one. However, no installer is necessary (it'll request vital information like where the compiler is on first run) and it can be put anywhere.
Running it is easy, and can be made even easier by a one-line script that provides the right parameters.


--- Quote ---Yes, in my environment I need a 100% replicable build environment. What that comes down to is that, with a project, I archive the code of the compiler and all the libraries and tools I'm using to generate the code, so that in 3 years or more I can come back to it if I need to and get the same binary from the source.  It also means I can give my development environment to a government regulator, tell them how to install it and type ./build in the root of the project, and the output will be exactly the same binary/binaries as I've submitted for approval.

--- End quote ---

If you need to provide the complete build environment, I presume that includes the compiler (especially if you want a bitwise-equal result). My compiler is about 27 MB compressed (MinGW, zipped). C::B + wxWidgets is about 6 MB zipped. That doesn't seem like a huge overhead in comparison (though admittedly probably more than make or jam would take up).


--- Quote ---
--- Quote from: Takeshi Miya on January 24, 2006, 05:48:01 pm ---
--- Quote ---1. Building mixed language applications.
--- End quote ---
From what I understand, this depends on the compiler. I've read in the forums that someone is using Fortran and C mixed, in Code::Blocks, without problems.

--- End quote ---

I was thinking more of two different compilers that generate compatible object code, which can be linked by an appropriate linker.  Not that I've tried, but it seems not very straightforward to do, because the compiler setup is project wide.  This is, I would agree, an uncommon thing to do, but it is done.  I would expect the person mixing Fortran and C is using GCC, which is really one compiler with two (or more) language front ends.

--- End quote ---

(Note: Compiler can be set on a per-target basis)
What's the fundamental difference between two compilers with compatible output and two frontends for the same compiler?
I'd bet that if I put together a project that created a library and an executable with different but compatible-output compilers, it'd link together just fine. After all, that's the definition of "compatible output", is it not?
Unfortunately, I only have one compiler installed, so I'm not in a position to try.
Oh, and this does not allow directly linking object files generated by different compilers together. So what? That's what static libraries are for (amongst other things).
