Peter Kriens <Peter.Kriens@xxxxxxxx> wrote on
12/02/2005 03:29:06 AM:
> Bundle dependencies miss substitutability. A bundle with
> bundle symbolic name X, version Y must always have the same
> constituents. Packages, especially specification packages, can be
> substituted at will. More important, it allows different providers
> provide a "part" of the solution.
You made a leap there. The substitution notion
typically comes up in the area of substituting implementations. You
can of course have multiple suppliers of standard API packages (e.g., javax.servlet, ...).
To do either of these, developers need to engineer their solution
such that their APIs are phrased in terms of interfaces or minimal implementation
classes that are separate from the actual full implementation. If
they don't, the package substitution approach does not work.
So, to summarize how this works for package-level dependencies:
If you break up your packages into API packages and implementation
packages, folks can depend on the API packages and the implementation can
be substituted. Similarly, it really doesn't matter who supplies
the API packages since the package id and version are supposed to uniquely
identify the content.
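To make the API/implementation split concrete, here is a hypothetical pair of manifest fragments (the package name org.example.http.api is invented for illustration):

```
Export-Package: org.example.http.api;version="1.0.0"

Import-Package: org.example.http.api;version="1.0.0"
```

The first header would appear in a provider's manifest, the second in a client's. Because the client names the package rather than the bundle, any bundle exporting org.example.http.api at a compatible version can be substituted underneath it.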
Just for fun, repeat the above para and substitute
"bundle" for "package". Now repeat the above
para and substitute "feature" for "package". Gee,
this substitutability is fun! ;-) There is engineering work needed
to enable the substitution but that is true in all cases.
So the problem is not in the mechanism or grain of
dependency management but in how people design/implement the elements.
Of course, in Eclipse we have not set a particularly good example
but that doesn't mean the mechanism/notion is flawed. The original
problem that started this chain was wanting to subset some bundles for
eRCP. Indeed it is a problem. Note, however, that it is also
entirely likely that they would want to subset some of the packages in
those bundles. It is a matter of grain. And for each person
that says one grain is too big/small, we can find another who says the
opposite. It depends on use case and need. Finer grain =>
more flexible but higher complexity. Coarser grain => less flexible
but lower complexity.
> We are running the full OSGi build (>750 bundles in total) with full
> version numbering and we rarely have a version problem. We have a
> packageinfo file in each package directory that holds the version. We
> have btool that picks up those numbers and builds the manifests' import
> and export clauses.
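For readers who haven't seen this scheme: as I understand it (details may differ), a packageinfo file is a one-line text file sitting next to a package's sources, something like:

```
version 1.2.0
```

which the build tool turns into the corresponding Export-Package clause (e.g., org.osgi.foo;version="1.2.0") in the generated manifest.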
I'm very curious as to how btool can build the import
clauses. How does it know the minimum value to use for an import range?
In any event, where/how you define the version numbers
for exports is not the problem. The problem is that a human has to
determine when and how to increment the version number. I suspect
that the OSGi build case is somewhat special in that (please correct where
I am wrong):
- there are actually relatively few exported packages since the only API
at play is the OSGi spec, which is actually quite small
- the version numbers of those packages increment once every couple of
years as a new spec is published
- there are a small number of people maintaining and managing those
export version numbers
> Version ranges are much harder because you have to think about
> semantics as you say. However, the only one that can say something useful
> about this is the exporter and we decided to NOT have export ranges.
Actually I was referring to the importers. As
the version numbers of exports increase, people who write code need to
ensure that the minimum in their import version range correctly captures
their requirements. For example, if org.osgi.foo goes from 1.0 to
1.1 because some type T is added, an importer of foo needs to decide if
they actually use T. If they do, they have to adjust their import
range to start at 1.1. If not, they can leave it at 1.0. This
is all fine if you have API packages that rarely change. The environment
experienced by OSGi folks spec'ing the standard or implementing standards,
is quite friendly for this approach. The unwashed masses, however,
experience rapidly evolving APIs and the continual challenge of ensuring
their version ranges are correct.
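To illustrate the importer's decision, here is a small self-contained sketch of range matching. This is plain Java, not the real org.osgi.framework version classes; the range syntax is simplified to "[min,max)" forms with numeric segments:

```java
// Minimal sketch of how an importer's version range admits or rejects
// an exporter's version. Not the real OSGi implementation.
public class RangeCheck {
    // Compare dotted numeric versions, e.g. "1.0" vs "1.1".
    static int cmp(String a, String b) {
        String[] x = a.split("\\."), y = b.split("\\.");
        for (int i = 0; i < Math.max(x.length, y.length); i++) {
            int xi = i < x.length ? Integer.parseInt(x[i]) : 0;
            int yi = i < y.length ? Integer.parseInt(y[i]) : 0;
            if (xi != yi) return Integer.compare(xi, yi);
        }
        return 0;
    }

    // range is "[min,max)" style: '[' / ']' inclusive, '(' / ')' exclusive.
    static boolean inRange(String version, String range) {
        boolean minIncl = range.charAt(0) == '[';
        boolean maxIncl = range.charAt(range.length() - 1) == ']';
        String[] bounds = range.substring(1, range.length() - 1).split(",");
        int lo = cmp(version, bounds[0]);
        int hi = cmp(version, bounds[1]);
        return (minIncl ? lo >= 0 : lo > 0) && (maxIncl ? hi <= 0 : hi < 0);
    }

    public static void main(String[] args) {
        // An importer that uses type T (added in 1.1) must raise its floor:
        System.out.println(inRange("1.0.0", "[1.1,2.0)")); // false: 1.0 lacks T
        System.out.println(inRange("1.1.0", "[1.1,2.0)")); // true
        System.out.println(inRange("2.0.0", "[1.1,2.0)")); // false: exclusive max
    }
}
```

The point is that the floor of the range encodes a human judgment (do I use T or not?) that no tool can fully automate.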
Again, don't get me wrong. I LIKE the finer-grained
approach. The problem is that it is a large amount of work
and as such, needs to be motivated by actual needs to warrant the cost.