
Re: [cdt-dev] Future of Managed Build


On 30/01/2020 at 23:03, Liviu Ionescu wrote:
Given the high level of the discussion at this moment, this is probably not exactly the right moment, but I would like to point out that once the high-level tools are defined, we'll have to come down to earth and somehow run the build.
I have that in mind all the time. Do you see problems?

The classical choice is to use make, but I strongly suggest you also consider ninja.

I fully agree. It is very important that we agree on what comes out of the model when it is provided with the project data. This is because I believe there will always be a need for "post processing", and the "post processing" will not be done by the processing team. This means that the starting point for someone willing to do "post processing" should be a well-documented API we agree on before we start implementing.

The way I see things (and that is just me) is that the new managed build produces something like an ordered list of entries of the form (wait flag, list of input files, list of output files, list of ordered commands).
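To make that shape concrete, here is a minimal sketch of one such entry as a plain Java value object. This is purely illustrative: the class and field names (BuildStep, waitForPreviousSteps, etc.) are hypothetical, my own reading of the four fields above, and not a proposed API.

    import java.nio.file.Path;
    import java.util.List;

    // Hypothetical sketch of one entry in the ordered build list.
    public final class BuildStep {
        public final boolean waitForPreviousSteps; // the "wait flag"
        public final List<Path> inputFiles;
        public final List<Path> outputFiles;
        public final List<String> commands;        // ordered commands to run

        public BuildStep(boolean waitForPreviousSteps, List<Path> inputFiles,
                         List<Path> outputFiles, List<String> commands) {
            this.waitForPreviousSteps = waitForPreviousSteps;
            this.inputFiles = List.copyOf(inputFiles);
            this.outputFiles = List.copyOf(outputFiles);
            this.commands = List.copyOf(commands);
        }
    }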

A single-threaded full build would then simply need to execute all the commands one after the other, in the order it receives them.
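For instance, a single-threaded runner over such a list could be as small as the following sketch. Again this is only an illustration under assumptions: it reuses the hypothetical BuildStep above and assumes a POSIX shell is available via "sh -c".

    import java.util.List;

    // Hypothetical single-threaded full build: run every command of every
    // step, one after the other, in the order the list provides them.
    public class SequentialBuilder {
        public static void buildAll(List<BuildStep> steps) throws Exception {
            for (BuildStep step : steps) {
                for (String command : step.commands) {
                    // Assumes a POSIX shell; a Windows runner would differ.
                    Process p = new ProcessBuilder("sh", "-c", command)
                            .inheritIO()
                            .start();
                    if (p.waitFor() != 0) {
                        throw new RuntimeException("Command failed: " + command);
                    }
                }
            }
        }
    }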

Other contributors could convert the list into a makefile, ninja, CMake or whatever.
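As an illustration of that kind of "post processing", a contributor could turn the same list into a build.ninja file roughly like this. It is a rough sketch under the same hypothetical BuildStep assumption; a real generator would also have to deduplicate rules, escape paths, and express dependency ordering and the wait flag properly.

    import java.nio.file.Path;
    import java.util.List;
    import java.util.stream.Collectors;

    // Hypothetical converter from the ordered build list to ninja syntax:
    // each command becomes its own rule/build pair.
    public class NinjaWriter {

        // Join paths into a space-separated string (no escaping here).
        private static String join(List<Path> paths) {
            return paths.stream().map(Path::toString)
                        .collect(Collectors.joining(" "));
        }

        public static String toNinja(List<BuildStep> steps) {
            StringBuilder out = new StringBuilder();
            int n = 0;
            for (BuildStep step : steps) {
                for (String command : step.commands) {
                    String rule = "r" + (n++);
                    out.append("rule ").append(rule).append('\n');
                    out.append("  command = ").append(command).append('\n');
                    out.append("build ").append(join(step.outputFiles))
                       .append(": ").append(rule).append(' ')
                       .append(join(step.inputFiles)).append('\n');
                }
            }
            return out.toString();
        }
    }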

Note that the model I described allows constructions that cannot be supported in make. I think we all agree that we do not want to limit future technology with past implementations.

Actually I would design a system to first fit the absolute basic simplicity of ninja, and then eventually port the generator to make.

I have been trained as an architect, so I like to do the opposite: start from the high-level, very complex view and drill down. This is what I'm trying to do now:

1: agree on the model functionality
2: agree on how the programmer will have to work with the modeller
3: agree on what should come out of the model/project component
4: agree on what will be part of what
5: agree on who will do what/when

As you can see ... there is a lot to agree on. Don't forget... we are talking about replacing a very old part of CDT which is basically still unchallenged. We will need all the help we can get, and the more we align, the easier (or is it less hard?) and quicker we can get to a good solution. We can only agree on 3 if you know for sure you can generate the CMake and ninja files from it. Your expertise is needed.

It took me some time to understand this, and perhaps you can save some time. Ninja, by design, claims to be very, very simple and fast: "Where other build systems are high-level languages Ninja aims to be an assembler." This is so true, and such a beautiful design decision, in my opinion.

	https://ninja-build.org


Personally I'll make ninja the default in my xPack Builder (which currently generates ninja and make files, and exports to .cproject), and I'll also try to get inspiration from the ninja design philosophy in the general xPacks design.

As a practical step, I'll soon add to the xPack Dev Tools a binary package with the ninja binaries, compiled for all major platforms (Linux Intel 32/64, Linux Arm 32/64, Windows 32/64, macOS 64), to complement the toolchain binary packages.

---

Another observation: although possible, I guess a very high-level tool like the one you are planning to build might also generate configuration files for CMake and Meson, but I think this is a bit counterproductive, since these tools will require another step to generate make or ninja files and then run the build.

I don't agree here. First of all, it is not me: I want to get as many people on board as possible and do this together. Secondly: Alexander announced a toolchain modelling tool for another project and suggested it would be reusable for CDT. I agree with him that such a tool should also be able to cater for any toolchain. I'm currently trying to define the CDT requirements for such a modelling tool. I hope Alexander and co. will succeed. If so, we will get the modelling "for free" (well, kind of). I think it is important to have our requirements written down so Alexander can take them to that project and alert us in case our requirements will not be met.

I have no idea what 2 and 3 will look like. Maybe Alexander has no idea either. I also have no idea how much our input is appreciated. For you and for me (as the Sloeber designer), 3 is extremely important, as it is the deciding factor in whether we will be able to generate the files we need with the quality we want. Backup plan: if the Alexander modelling route fails (earthquake, corona virus...) but we know what the model and metadata should look like, we can model it using XML, like the current managed build does, still producing the same 3.

Unlike ninja, which was designed to have its input file generated by a machine, both CMake and Meson were designed to have their input files written by humans, and, even if you manage to generate them with a tool, it will be extremely difficult to further edit them with a tool.
I have asked the question here on the dev list: "Do we have the requirement of readable makefiles?" I think that, as long as the build works, nobody really cares whether it is make, CMake or ninja doing the heavy build lifting. If you disagree, please elaborate.
Hope this helps.

It does.

Jantje


Liviu






_______________________________________________
cdt-dev mailing list
cdt-dev@xxxxxxxxxxx
To change your delivery options, retrieve your password, or unsubscribe from this list, visit
https://www.eclipse.org/mailman/listinfo/cdt-dev

