
RE: [cdt-dev] RIP Wascana, Build System discussion

Hi Chris,

Yes indeed, you make very good points.  But we're not venturing into new territory here -- we're already there ;)  We use this model already in our projects.

We have run into these issues with unstable build file formats from time to time, but the context from the client, e.g., SDK or build configuration, is usually enough to distinguish what file format versions should be expected and generated.  And our parser/rewriter model handles unknown content, so new language features (or even errors) don't make it fail.  And yes, it is a lot like refactoring... :)
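A minimal sketch of the idea (hypothetical code, not our actual implementation): a line-oriented rewriter that updates only the one assignment it understands, a `SOURCES` variable here, and passes every other line through verbatim, so unknown syntax or even errors survive the round trip:

```python
# Sketch only: a rewriter that touches the single assignment it models and
# round-trips everything else (comments, conditionals, unknown directives)
# byte-for-byte. Real build-script grammars need more care (line
# continuations, prefix collisions like SOURCES_EXTRA, etc.).

def rewrite_sources(makefile_text, new_sources):
    """Replace the SOURCES assignment, passing all other lines through."""
    out = []
    replaced = False
    for line in makefile_text.splitlines(keepends=True):
        if not replaced and line.startswith("SOURCES"):
            out.append("SOURCES = " + " ".join(new_sources) + "\n")
            replaced = True
        else:
            out.append(line)  # unknown content survives untouched
    return "".join(out)
```

The point of the design is that correctness degrades gracefully: content the tool does not model is never regenerated, only copied.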

The problems with usability do coincide with your points in places (e.g., there are cases where the plugin does not know how to use a new feature and so does not generate it; formatting can get a little goofed up from time to time; etc.).  None of these have been intractable issues, though.

The main issue in the model as a whole, as you mention, is not being able to control the build as finely as one desires.  It's difficult in some cases to perform operations on single files, for example, unless the build program can be invoked with such specificity.  But that's mostly out of our control, given the other goals to allow builds and full editing from the command line.

I'll leave it up to others to decide what model they want to use.  My proposals are intended to encapsulate some experience from our projects, so that if someone goes down this path with CDT in the future, they won't need to re-discover the design patterns themselves.

-- Ed

While superficially this all sounds great, I think you will find you will run into some problems over time.

Round-trip editing of the build scripts/data files is only going to work so long as those files are in a known, stable format. While I would say the build systems you mention are by and large relatively mature right now, it is entirely possible that some new whiz-bang feature of the external build system comes out, the user creates their build scripts by hand using it, and then tries importing them into Eclipse. Until your Eclipse plugins are brought up to speed to support whatever new features show up in those files, your whole build system could end up broken.

These external build systems don't typically embed any version info in their scripts, e.g. to say they were written for version 1 of the tool rather than version 2. There are also multiple conventions within the same family of build script, e.g. it's up to me to know that my makefile uses GNU make syntax and not POSIX make syntax... there's no tag or anything in the file itself that definitively and unambiguously says "I am a POSIX makefile". This is all because these external build systems were designed more or less in a vacuum and were never intended to be integrated with any tooling.
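For illustration only (a made-up fragment, not from any project in this thread), each of the following lines depends on a GNU make extension that a strict POSIX make would reject, and nothing in the file declares which dialect it is written in:

```make
# GNU-make-only constructs; a POSIX make will reject these, yet the file
# carries no marker identifying its dialect.
SRCS := $(wildcard *.c)      # ':=' and the wildcard function are GNU extensions
OBJS  = $(SRCS:%.c=%.o)      # '%' pattern substitution is GNU (POSIX allows only suffix form)
ifeq ($(CC),gcc)             # conditionals don't exist in POSIX make
CFLAGS += -Wall              # '+=' was a GNU extension as well
endif
```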

For a stable, mature build system, it could work. But for how long? We've already had problems with Cygwin changing the path formats their GNU make implementation will accept, which has broken some things in managed build for some time now, and GNU make is in theory a relatively stable format. But then again, is it? Although there hasn't been a new version of GNU make since 2006, I'm sure that eventually there will be a new version that accepts some new syntax.

The standard make tools we have now work well because they don't try to do a whole lot. If the makefile parser gets screwed up, right now the worst that happens is your makefile is syntax highlighted incorrectly in the editor, or the outline view doesn't look like you expect it to. The project still builds, and life goes on. But if you generate the wrong data in the makefile, the build will no longer work. Then there's also the issues of trying to match the user's conventions and cosmetics... you end up having to deeply model their build file to the detail level that you would need for a refactoring tool.

Microsoft has a much easier time with all this stuff because they control the entire system from end to end... the compiler, the build system (whether internal to the IDE or at the command line), and the IDE. Even still, they gave up on generating nmake files from .dsp files in VC7 because they couldn't get it right despite having 100% control over everything in the system. They've since switched to a command line builder (msbuild) that knows how to directly build from your project files. We on the other hand have NO control over any build system other than the MBS internal builder. Hence we run into a lot of problems.

I'm not going to say you shouldn't try if you are so inclined, but this is a HARD problem. You basically have all the difficulties we already have with managed build, and are introducing MORE problems (and more difficult ones!) to solve. Personally, I'm not sure I would be brave enough to go there.

Chris Recoskie
Team Lead, IBM CDT and RDT
IBM Toronto

Sent by: cdt-dev-bounces@xxxxxxxxxxx

05/22/2009 12:14 PM
Please respond to
"CDT General developers list." <cdt-dev@xxxxxxxxxxx>





RE: [cdt-dev] RIP Wascana, Build System discussion

Hi all,

Regarding building outside CDT, here are some ideas, based on use cases in our embedded (Symbian and Linux) products, that could make CDT more friendly to external build systems.

We use, variously, autotools (though not Jeff's autotools builder, for reasons mentioned by Doug), qmake, make, and Symbian's build system, but want stronger and more unified UI integration. We've had to do a lot of custom and repetitive work for this.

What if, instead of a completely hands-off model (like the standard make model, where the user manually edits Makefiles in an editor) or a completely automated model (like MBS, which takes full control of compiler/linker invocation), there were an external build model: an architecture for wrapping existing build systems such as autotools, qmake, scons, etc., and integrating each build system's model into the UI?
The goal of such a model, as others have mentioned, would be to let the user switch back and forth between the IDE and the command line freely and to let the user share projects from source control with developers not using IDEs. But unlike standard make or managed make, it should automate the updating of build scripts from resource changes and update build configuration data from build script changes, without completely taking over the work of the build system or relying on build-time parsing to gather configuration data.
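As a sketch of what "automate the updating of build scripts from resource changes" could look like (hypothetical names, not a proposed CDT API), a handler might fold a resource delta into the source list that the build-script rewriter then writes back:

```python
# Hypothetical sketch: a resource-change event arrives as (kind, path) pairs;
# the handler edits the modeled source list rather than regenerating the
# whole build script, so hand-written content elsewhere is untouched.

def apply_delta(sources, delta):
    """delta is a list of (kind, path) pairs, kind in {'ADDED', 'REMOVED'}."""
    current = list(sources)
    for kind, path in delta:
        if kind == "ADDED" and path.endswith(".c") and path not in current:
            current.append(path)   # new translation unit joins the build
        elif kind == "REMOVED" and path in current:
            current.remove(path)   # deleted file leaves the build
    return current
```

The updated list would then be handed to the script rewriter, keeping the IDE and the command-line build in sync without either side owning the file.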

In addition to the build support, there should be UI integration support for a custom common project navigator model derived from the build scripts. That is, the external build system should be able to influence or control the way the CDT project navigator model is generated. We're still hammering out UI issues with presenting the project model that's actually built versus the tree of files that happen to be living in the project directory. CDT's source folders alone don't cut it as an organization strategy :) We've resorted to ancillary views for this, reinventing the wheel to attach copy/rename/move/build actions, support filtering, etc. (gahh).

The builder support, I think, would be minimal: a custom command-line builder, similar to the standard make builder, which controls the build by invoking the external tool with build scripts/files from the project. It could handle incremental/full/clean builds similarly to standard make as well, with appropriate vendor configuration.
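A rough sketch of such a builder (hypothetical vendor configuration, not actual CDT extension-point data): the IDE's build kinds map onto vendor-configured command lines, and the external tool does all the real work:

```python
# Sketch only: a thin command-line builder that delegates to the external
# tool. The table below is an assumed vendor configuration, not a real
# CDT structure.
import subprocess

VENDOR = {
    "FULL_BUILD": ["make", "all"],
    "INCREMENTAL_BUILD": ["make"],   # let make's own dependency logic decide
    "CLEAN_BUILD": ["make", "clean"],
}

def command_for(kind):
    """Map an IDE build kind to the vendor-configured command line."""
    return VENDOR[kind]

def run_build(kind, cwd="."):
    """Invoke the external build tool in the project directory."""
    return subprocess.run(command_for(kind), cwd=cwd,
                          capture_output=True, text=True)
```

Because the builder only chooses a command line, all real build logic (dependencies, incremental rebuilds) stays with the external tool, which is what keeps the command line and the IDE equivalent.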

Extensions would be implemented and invoked by CDT to (1) handle synchronizing changes between the resource model and the build scripts by interpreting resource change events, (2) generate build configuration data and indexer settings by parsing build scripts, (3) react to modification, creation, and deletion of build scripts, (4) analyze resource change events to determine whether a recompile or relink or reindex is needed, and (5) generate/modify the CDT's project navigator model.
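As a sketch of extension (2) above (hypothetical, assuming the flags are recoverable from the script as a plain string), indexer settings could be parsed directly out of a flags variable instead of scraped from build output at build time:

```python
# Sketch only: derive indexer settings (include paths, macro definitions)
# by parsing a CFLAGS-style string taken from the build script.
import shlex

def extract_indexer_settings(cflags_line):
    """Pull -I include paths and -D macro definitions out of a flags string."""
    includes, defines = [], {}
    for tok in shlex.split(cflags_line):
        if tok.startswith("-I"):
            includes.append(tok[2:])
        elif tok.startswith("-D"):
            name, _, value = tok[2:].partition("=")
            defines[name] = value or "1"   # bare -DFOO conventionally means 1
    return includes, defines
```

A real implementation would have to expand the build system's own variables and conditionals first, which is exactly where the per-build-system extension earns its keep.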

These extensions, in fact, need not be tied whatsoever to the external build model.

Of course, a lot of work must go into the implementation of the extensions, to handle parsing and rewriting build scripts. But by being standard extensions to CDT, standard implementations could be written once (e.g. managed make + standard make, autotools, qmake, etc).

Would it be appropriate for CDT to provide support like this? Were you thinking something along these lines?

-- Ed

From: Treggiari, Leo [mailto:leo.treggiari@xxxxxxxxx]
Sent: Friday, May 22, 2009 10:27 AM
To: CDT General developers list.
Subject: RE: [cdt-dev] RIP Wascana, Build System discussion

> I'm not sure though that the MBS was ever built to support integrations like this. It was intended to behave just as Visual C++ does, which also doesn't deal well with external build tools. It's important that we keep that in mind.

We didn't consider building outside of CDT enough in the MBS design. I guess the thought was that you could take the generated makefile and use that outside of Eclipse. I don't know if that works at all in practice.

In the last major build model update, Mikhail Sennikovsky tried to separate out a generic build model from the MBS. There are pieces of functionality that are not MBS specific, for example allowing a tool chain to specify an "environment provider" for any build system.

By the way, VC++ will play nicely with external builds in VS 2010. A VC++ 2010 project file is an MSBuild file and so can be used directly by MSBuild outside of Visual Studio. VC++ also has a new method of integrating another C++ toolset. It's not as flexible as MBS and is almost exclusively "declarative" - you have to write XML files using the MSBuild schema and XAML files to describe property pages.

From: cdt-dev-bounces@xxxxxxxxxxx [mailto:cdt-dev-bounces@xxxxxxxxxxx] On Behalf Of Doug Schaefer
Sent: Friday, May 22, 2009 7:56 AM
To: CDT General developers list.
Subject: Re: [cdt-dev] RIP Wascana, Build System discussion

On Fri, May 22, 2009 at 4:34 AM, Alex Blewitt <alex.blewitt@xxxxxxxxx<mailto:alex.blewitt@xxxxxxxxx>> wrote:
On Fri, May 22, 2009 at 9:05 AM, Jesper Eskilson <jesper.eskilson@xxxxxx<mailto:jesper.eskilson@xxxxxx>> wrote:
> - It does not have any decent support for offline/headless builds.
> - It integrates badly with other build systems. If you build your code base
> around MBS, you force people to use CDT. This makes CDT/MBS a much harder
> sell than if CDT was easy to integrate with other build systems.
This is one of the killers for the Eclipse plugin build process as well. It's pretty much impossible to do decent headless builds outside of Eclipse (and the build.xml is painful in the extreme), and it pretty much ties you to doing things the way PDE expects (META-INF at the root of the plugin, etc.). That's one of the reasons you don't see much OSGi dev outside of the Eclipse space.

So as nice as having an internal build system is, having the ability to run builds externally (and with independent tools) is also a good way of supporting a build process. It would be interesting to know how CDT would fare in the face of, e.g., './configure'-type scripts which generate a bunch of these things on the fly; a GUI wrapper around the args needed for the ./configure script would be handy.

Interestingly enough, the Red Hat guys have an Autoconf plugin which does exactly that. My effort on the Makefile project side will take that into account and make sure the workflows are clean.

I'm not sure though that the MBS was ever built to support integrations like this. It was intended to behave just as Visual C++ does, which also doesn't deal well with external build tools. It's important that we keep that in mind.

And that's why I want to make adoption easier for users, or even provide integrations for these external build tools. But it won't be based on MBS. I think it would be very difficult to push the build model to do these kinds of things. I saw Jeff from Red Hat try with his autoconf thing. It wasn't pretty.


cdt-dev mailing list

