Question regarding numeric types

Ed, all,

I'm wondering how the OCL semantics for numeric types clashes with Java's definition. We've had this discussion before, but here's a thought that I think hasn't been brought forward yet.

If we integrate OCL with Ecore/EMF, then on the EMF side there are collection types and their constraints, derived from the Ecore multiplicity settings. For example, there is UniqueEList, whose add/contains operations are based on regular Java equality.
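A minimal sketch of that behavior in plain Java (no EMF on the classpath; addUnique is a hypothetical stand-in for UniqueEList's add, which likewise delegates to equals/contains):

```java
import java.util.ArrayList;
import java.util.List;

public class UniqueAddDemo {
    // Hypothetical stand-in for UniqueEList.add: reject a value only if an
    // equal one (per Object.equals, i.e. Java equality) is already present.
    static <E> boolean addUnique(List<E> list, E value) {
        if (list.contains(value)) {
            return false;
        }
        return list.add(value);
    }

    public static void main(String[] args) {
        List<Number> feature = new ArrayList<>();
        addUnique(feature, Double.valueOf(3.0)); // an EDoubleObject value
        addUnique(feature, Integer.valueOf(3));  // an EIntegerObject value
        // Double.equals requires its argument to be a Double, so 3.0 and 3
        // are distinct in Java and both end up in the "unique" feature.
        System.out.println(feature); // prints [3.0, 3]
    }
}
```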

With this, an EObject many-feature that is modeled as unique can easily hold EDoubleObject and EIntegerObject values that are not equal according to Java semantics but are equal according to standard OCL semantics. If an OCL PropertyCallExp accesses such a feature, the result is an inconsistent OCL collection, inconsistent in one of two ways:

- if OCL chooses to collapse values that are equal according to OCL semantics, the property's cardinality will differ from the EMF feature's cardinality;

- if OCL leaves the values that are distinct according to Java semantics in place, the collection is inconsistent from OCL's point of view: it is supposed to be unique, yet it contains two distinct values that are equal according to OCL semantics.
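Both outcomes can be made concrete with a small sketch; oclEquals below is my hypothetical value-based comparison standing in for standard OCL numeric equality, under which 3.0 = 3 holds:

```java
import java.util.ArrayList;
import java.util.List;

public class CardinalityMismatchDemo {
    // Hypothetical stand-in for standard OCL numeric equality:
    // numbers are compared by value, so 3.0 = 3 holds.
    static boolean oclEquals(Number a, Number b) {
        return a.doubleValue() == b.doubleValue();
    }

    public static void main(String[] args) {
        // The EMF feature is unique per Java equality, so it holds both values.
        List<Number> emfFeature = List.of(Double.valueOf(3.0), Integer.valueOf(3));

        // Option 1: collapse OCL-equal values when building the OCL Set.
        List<Number> collapsed = new ArrayList<>();
        for (Number n : emfFeature) {
            if (collapsed.stream().noneMatch(m -> oclEquals(m, n))) {
                collapsed.add(n);
            }
        }
        // The OCL cardinality no longer matches the EMF feature's cardinality.
        System.out.println(emfFeature.size()); // prints 2
        System.out.println(collapsed.size());  // prints 1

        // Option 2: keep emfFeature as-is on the OCL side -- then the "unique"
        // OCL Set contains two elements that are equal under OCL semantics.
    }
}
```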

From an EMF/Ecore perspective, the most pragmatic way out of this dilemma seems to be to alter the OCL semantics so that they comply with the Java semantics. In particular, this would make 3.0 <> 3, which might come as a shock to the OCL purist but would probably be fairly intuitive to Java/EMF/Ecore consumers.

The implications, of course, are horrible in another sense: OCL expressions would no longer be entirely portable across environments. But is that a dominant use case? What would it look like? How would I obtain and re-use a significant body of OCL-implemented libraries in a platform context different from the one where they were developed?

-- Axel
